Posted to mapreduce-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2016/04/04 18:24:29 UTC

Hadoop-Mapreduce-trunk - Build # 3150 - Still Failing

See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3150/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31992 lines...]
Running org.apache.hadoop.mapred.pipes.TestPipesNonJavaInputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.598 sec - in org.apache.hadoop.mapred.pipes.TestPipesNonJavaInputFormat
Running org.apache.hadoop.mapred.pipes.TestPipeApplication
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.941 sec - in org.apache.hadoop.mapred.pipes.TestPipeApplication
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.655 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.776 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.373 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.809 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.875 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.159 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:13 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2016-04-04T16:24:20+00:00
[INFO] Final Memory: 33M/607M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3328 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3328/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 315 lines...]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop MapReduce 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... FAILURE [  2.955 s]
[INFO] Apache Hadoop MapReduce Core ...................... SKIPPED
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.522 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4.921 s
[INFO] Finished at: 2016-05-17T01:30:54+00:00
[INFO] Final Memory: 23M/723M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (depcheck) on project hadoop-mapreduce-client: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Mapreduce-trunk - Build # 3327 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3327/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 6054 lines...]
and
+-org.apache.hadoop:hadoop-mapreduce:3.0.0-alpha1-SNAPSHOT
  +-org.apache.hadoop:hadoop-annotations:3.0.0-alpha1-20160516.224859-15
and
+-org.apache.hadoop:hadoop-mapreduce:3.0.0-alpha1-SNAPSHOT
  +-org.apache.hadoop:hadoop-common:3.0.0-alpha1-20160516.224934-10
    +-org.apache.hadoop:hadoop-annotations:3.0.0-alpha1-SNAPSHOT

[WARNING] Rule 0: org.apache.maven.plugins.enforcer.DependencyConvergence failed with message:
Failed while enforcing releasability the error(s) are [
Dependency convergence error for org.apache.hadoop:hadoop-annotations:3.0.0-alpha1-SNAPSHOT paths to dependency are:
+-org.apache.hadoop:hadoop-mapreduce:3.0.0-alpha1-SNAPSHOT
  +-org.apache.hadoop:hadoop-common:3.0.0-alpha1-20160516.224934-10
    +-org.apache.hadoop:hadoop-annotations:3.0.0-alpha1-SNAPSHOT
and
+-org.apache.hadoop:hadoop-mapreduce:3.0.0-alpha1-SNAPSHOT
  +-org.apache.hadoop:hadoop-annotations:3.0.0-alpha1-20160516.224859-15
and
+-org.apache.hadoop:hadoop-mapreduce:3.0.0-alpha1-SNAPSHOT
  +-org.apache.hadoop:hadoop-common:3.0.0-alpha1-20160516.224934-10
    +-org.apache.hadoop:hadoop-annotations:3.0.0-alpha1-SNAPSHOT
]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... FAILURE [  8.146 s]
[INFO] Apache Hadoop MapReduce Core ...................... SKIPPED
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... FAILURE [  0.516 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 10.439 s
[INFO] Finished at: 2016-05-17T00:33:29+00:00
[INFO] Final Memory: 29M/913M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (depcheck) on project hadoop-mapreduce-client: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (depcheck) on project hadoop-mapreduce: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Mapreduce-trunk - Build # 3326 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3326/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 316 lines...]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop MapReduce 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... FAILURE [  3.147 s]
[INFO] Apache Hadoop MapReduce Core ...................... SKIPPED
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.486 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 5.196 s
[INFO] Finished at: 2016-05-16T23:06:31+00:00
[INFO] Final Memory: 23M/723M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (depcheck) on project hadoop-mapreduce-client: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Mapreduce-trunk - Build # 3325 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3325/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32703 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.458 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:09 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 31.947 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.675 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:45 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:04 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.653 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [12:08 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.302 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:18 h
[INFO] Finished at: 2016-05-16T22:40:51+00:00
[INFO] Final Memory: 35M/599M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)
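
The three failures above reduce to two JUnit patterns: an assertEquals carrying the "Output directory not empty" prefix, and an assertTrue carrying "The environment checker job failed." as its message. The standalone sketch below only illustrates those two patterns; the plain java.io.File listing and the boolean flag are assumed stand-ins for the real committer output path and MiniMR job result, neither of which is shown in this archive.

import java.io.File;

import org.junit.Assert;
import org.junit.Test;

public class AbortAndEnvCheckerSketch {

  @Test
  public void outputDirectoryShouldBeEmptyAfterAbort() {
    // Assumed stand-in for the committer's output directory; in the real test
    // this would be the job output path inspected after abort().
    File outDir = new File("sketch-output");
    File[] leftovers = outDir.listFiles();
    int count = (leftovers == null) ? 0 : leftovers.length;
    // Any leftover files are reported as, e.g.:
    //   Output directory not empty expected:<0> but was:<4>
    Assert.assertEquals("Output directory not empty", 0, count);
  }

  @Test
  public void environmentCheckerJobMustSucceed() {
    // Assumed stand-in for the result of the MiniMR "environment checker" job.
    boolean jobSucceeded = false;
    // A false value fails with the supplied message:
    //   java.lang.AssertionError: The environment checker job failed.
    Assert.assertTrue("The environment checker job failed.", jobSucceeded);
  }
}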




Hadoop-Mapreduce-trunk - Build # 3324 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3324/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32686 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.861 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.769 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.031 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:32 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:51 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.461 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [08:48 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.244 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:08 h
[INFO] Finished at: 2016-05-16T19:05:38+00:00
[INFO] Final Memory: 36M/603M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3323 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3323/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32691 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.902 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:58 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.022 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.271 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:49 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:59 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.398 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [08:48 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.245 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:12 h
[INFO] Finished at: 2016-05-16T10:54:13+00:00
[INFO] Final Memory: 35M/592M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3322 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3322/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32687 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.245 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:03 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.970 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.216 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:34 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:48 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.430 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:14 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.247 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-05-16T08:13:12+00:00
[INFO] Final Memory: 36M/595M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3321 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3321/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32684 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.864 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.981 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.057 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:27 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:49 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.424 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:33 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.316 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:11 h
[INFO] Finished at: 2016-05-16T04:08:23+00:00
[INFO] Final Memory: 35M/598M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3320 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3320/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31450 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.783 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.388 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:53 min
[INFO] Finished at: 2016-05-14T19:58:39+00:00
[INFO] Final Memory: 31M/697M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
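
The "Error Message: null" above is what the report shows when a bare, single-argument Assert.assertTrue(condition) fails: that overload raises an AssertionError whose getMessage() is null. The condition checked by TestCLI.testGetJob at line 181 is not reproduced in this archive; the boolean below is an assumed placeholder used only to illustrate the pattern.

import org.junit.Assert;
import org.junit.Test;

public class NullMessageSketch {

  @Test
  public void bareAssertTrueCarriesNoMessage() {
    // Assumed placeholder for whatever TestCLI.testGetJob checks at line 181.
    boolean conditionUnderTest = false;
    // A false value fails with an AssertionError whose getMessage() is null,
    // which the build report above renders as "Error Message: null".
    Assert.assertTrue(conditionUnderTest);
  }
}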




Hadoop-Mapreduce-trunk - Build # 3319 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3319/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31450 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.849 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.394 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:54 min
[INFO] Finished at: 2016-05-13T21:58:52+00:00
[INFO] Final Memory: 31M/697M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)




Hadoop-Mapreduce-trunk - Build # 3318 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3318/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32654 lines...]
[INFO] Building Apache Hadoop MapReduce 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.345 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:02 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.964 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.463 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:53 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:06 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:00 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.748 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [11:17 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.267 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:29 h
[INFO] Finished at: 2016-05-13T21:10:05+00:00
[INFO] Final Memory: 39M/592M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3317 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3317/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32659 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.874 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.028 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.025 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:26 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:49 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.439 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:02 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.249 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-05-13T18:06:36+00:00
[INFO] Final Memory: 36M/595M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)
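
The console excerpts above all end with the same recovery hint: after fixing the underlying problem, the reactor can be restarted at the failing module with mvn <goals> -rf :hadoop-mapreduce-client-jobclient. When chasing a single recurring test such as TestMiniMRChildTask, it is usually quicker to combine that with Surefire's standard test-selection property, for example mvn test -rf :hadoop-mapreduce-client-jobclient -Dtest=TestMiniMRChildTask; the -Dtest value here is only an illustration and is not taken from this log.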




Hadoop-Mapreduce-trunk - Build # 3316 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3316/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32662 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.783 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.236 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.977 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:25 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:49 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.417 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:02 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.241 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:10 h
[INFO] Finished at: 2016-05-13T13:07:42+00:00
[INFO] Final Memory: 35M/595M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3315 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3315/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32657 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.865 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:56 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.563 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.237 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:39 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:49 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:45 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.409 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [08:46 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.250 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:11 h
[INFO] Finished at: 2016-05-13T10:11:28+00:00
[INFO] Final Memory: 35M/592M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3314 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3314/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31426 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.882 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.360 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:53 min
[INFO] Finished at: 2016-05-13T02:57:26+00:00
[INFO] Final Memory: 35M/899M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
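
The "Error Message: null" above is not a reporting glitch: the stack trace runs through Assert.assertTrue(Assert.java:52) into Assert.assertTrue(Assert.java:41) and then Assert.fail(Assert.java:86), which is JUnit 4's one-argument assertTrue(boolean) delegating to assertTrue(null, condition) and throwing an AssertionError without a message. The condition checked at TestCLI.java:181 was simply false. A minimal sketch of the difference, with a hypothetical condition name since the actual test body is not shown here:

    import static org.junit.Assert.assertTrue;

    // Illustrative only -- not the TestCLI source.
    public class MessagelessAssertSketch {
      public static void main(String[] args) {
        boolean jobFound = false;   // stand-in for whatever TestCLI.java:181 checks
        // One-argument form: on failure the AssertionError carries no message,
        // so the report prints "Error Message: null", as above.
        assertTrue(jobFound);
        // Message-bearing form (would have named the failure instead):
        // assertTrue("getJob should have found the job", jobFound);
      }
    }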




Hadoop-Mapreduce-trunk - Build # 3313 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3313/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32698 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.852 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.105 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.141 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:27 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:51 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.407 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:45 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.251 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-05-13T02:28:07+00:00
[INFO] Final Memory: 37M/703M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3312 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3312/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32157 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-alpha1-SNAPSHOT/hadoop-mapreduce-3.0.0-alpha1-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.901 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:53 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.824 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.305 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [08:27 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.561 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:22 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.252 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 20:23 min
[INFO] Finished at: 2016-05-12T23:12:50+00:00
[INFO] Final Memory: 36M/691M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler

Error Message:
Timed out waiting for Mini HDFS Cluster to start

Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitClusterUp(MiniDFSCluster.java:1345)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:848)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.setUpClass(TestJobHistoryEventHandler.java:93)


FAILED:  org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
	at org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.cleanUpClass(TestJobHistoryEventHandler.java:98)
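
These two entries are really one failure and its side effect: setUpClass at TestJobHistoryEventHandler.java:93 gave up waiting for the MiniDFSCluster to come up, and the NullPointerException from cleanUpClass at line 98 is then the teardown most likely dereferencing a cluster field that was never assigned. That is the usual shape of a @BeforeClass/@AfterClass pair without a null guard; an illustrative sketch with hypothetical names (the real test source is not part of this log) looks like this:

    import org.junit.AfterClass;
    import org.junit.BeforeClass;

    // Illustrative lifecycle sketch, not the TestJobHistoryEventHandler source.
    public class ClusterLifecycleSketch {
      private static AutoCloseable cluster;   // stand-in for a MiniDFSCluster handle

      @BeforeClass
      public static void setUpClass() throws Exception {
        cluster = startMiniCluster();   // throws if the cluster never comes up
      }

      @AfterClass
      public static void cleanUpClass() throws Exception {
        // Without this guard, a failed setUpClass leaves cluster == null and
        // the teardown itself fails with a NullPointerException.
        if (cluster != null) {
          cluster.close();
        }
      }

      private static AutoCloseable startMiniCluster() throws Exception {
        return () -> { };   // placeholder for MiniDFSCluster.Builder(...).build()
      }
    }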




Hadoop-Mapreduce-trunk - Build # 3311 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3311/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32661 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.923 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:29 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 36.372 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.938 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:13 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:39 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:08 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  2.456 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [13:59 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.540 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:42 h
[INFO] Finished at: 2016-05-12T21:23:23+00:00
[INFO] Final Memory: 35M/603M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestNNBench.testNNBenchCreateReadAndDelete

Error Message:
create_write should create the file

Stack Trace:
java.lang.AssertionError: create_write should create the file
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.hdfs.TestNNBench.testNNBenchCreateReadAndDelete(TestNNBench.java:55)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3310 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3310/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32661 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.824 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:53 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.702 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.307 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:44 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:50 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.608 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:39 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.238 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:11 h
[INFO] Finished at: 2016-05-12T16:26:47+00:00
[INFO] Final Memory: 35M/598M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)
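
The TestMRCJCFileOutputCommitter.testAbort entry above is a different pattern from the environment-checker failures: the "Output directory not empty expected:<0> but was:<4>" text is the formatting of JUnit 4's message-bearing assertEquals, i.e. the test counted the files left in the job's output directory after an abort and expected zero but found four. A hedged sketch of that check, with hypothetical names since the actual test body is not in this log:

    import static org.junit.Assert.assertEquals;

    import java.io.File;

    // Illustrative only -- not the TestMRCJCFileOutputCommitter source.
    public class AbortCleanupAssertionSketch {
      static void assertOutputDirEmpty(File outputDir) {
        String[] leftovers = outputDir.list();
        int count = (leftovers == null) ? 0 : leftovers.length;
        // assertEquals(message, expected, actual): with four leftover files
        // this prints exactly "Output directory not empty expected:<0> but was:<4>".
        assertEquals("Output directory not empty", 0, count);
      }
    }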




Hadoop-Mapreduce-trunk - Build # 3309 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3309/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32664 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.198 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:55 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.688 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.738 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:43 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:49 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.460 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:50 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.251 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-05-12T13:25:13+00:00
[INFO] Final Memory: 36M/733M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3308 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3308/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32654 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.735 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:03 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 31.280 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.176 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:10 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.490 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:41 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.260 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:14 h
[INFO] Finished at: 2016-05-12T06:01:13+00:00
[INFO] Final Memory: 35M/594M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3307 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3307/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32738 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.447 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:53 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.379 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.534 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:37 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:50 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.503 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:00 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.259 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-05-12T02:16:16+00:00
[INFO] Final Memory: 37M/595M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3306 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3306/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32405 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.006 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:21 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 35.601 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.376 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [09:55 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  2.068 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [11:40 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.329 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 24:48 min
[INFO] Finished at: 2016-05-11T20:25:20+00:00
[INFO] Final Memory: 37M/827M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
16 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testCommandLine

Error Message:
org/apache/hadoop/security/authentication/server/AuthenticationFilter

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/security/authentication/server/AuthenticationFilter
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.http.HttpServer2.constructSecretProvider(HttpServer2.java:454)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:346)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:109)
	at org.apache.hadoop.http.HttpServer2$Builder.build(HttpServer2.java:291)
	at org.apache.hadoop.yarn.webapp.WebApps$Builder.build(WebApps.java:276)
	at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:345)
	at org.apache.hadoop.mapreduce.v2.app.client.MRClientService.serviceStart(MRClientService.java:143)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceStart(MRAppMaster.java:1223)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:301)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testCommandLine(TestMapReduceChildJVM.java:54)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLineWithSeparateShuffle

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM$MyMRApp.<init>(TestMapReduceChildJVM.java:256)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine(TestMapReduceChildJVM.java:108)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLineWithSeparateShuffle(TestMapReduceChildJVM.java:87)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testCommandLineWithLog4JConifg

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM$MyMRApp.<init>(TestMapReduceChildJVM.java:256)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testCommandLineWithLog4JConifg(TestMapReduceChildJVM.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM$MyMRApp.<init>(TestMapReduceChildJVM.java:256)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine(TestMapReduceChildJVM.java:108)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine(TestMapReduceChildJVM.java:102)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testAutoHeapSizes

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM$MyMRApp.<init>(TestMapReduceChildJVM.java:256)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testAutoHeapSize(TestMapReduceChildJVM.java:227)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testAutoHeapSizes(TestMapReduceChildJVM.java:183)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testEnvironmentVariables

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM$MyMRApp.<init>(TestMapReduceChildJVM.java:256)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testEnvironmentVariables(TestMapReduceChildJVM.java:281)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLineWithSeparateCRLAShuffle

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM$MyMRApp.<init>(TestMapReduceChildJVM.java:256)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine(TestMapReduceChildJVM.java:108)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLineWithSeparateCRLAShuffle(TestMapReduceChildJVM.java:96)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttemptContainerRequest.testAttemptContainerRequest

Error Message:
org/apache/hadoop/fs/Path

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Path
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttemptContainerRequest.testAttemptContainerRequest(TestTaskAttemptContainerRequest.java:82)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testGetTaskAttemptIdXMLState

Error Message:
expected:<application/xml> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/xml> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testGetTaskAttemptIdXMLState(TestAMWebServicesAttempt.java:185)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testPutTaskAttemptIdState

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testPutTaskAttemptIdState(TestAMWebServicesAttempt.java:224)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testGetTaskAttemptIdState

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testGetTaskAttemptIdState(TestAMWebServicesAttempt.java:157)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testPutTaskAttemptIdXMLState

Error Message:
expected:<application/xml> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/xml> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testPutTaskAttemptIdXMLState(TestAMWebServicesAttempt.java:257)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConf

Error Message:
org/apache/hadoop/yarn/api/records/YarnApplicationState

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/api/records/YarnApplicationState
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.yarn.MockApps.<clinit>(MockApps.java:37)
	at org.apache.hadoop.mapreduce.v2.app.MockAppContext.<init>(MockAppContext.java:38)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf$1.configureServlets(TestAMWebServicesJobConf.java:117)
	at com.google.inject.servlet.ServletModule.configure(ServletModule.java:53)
	at com.google.inject.AbstractModule.configure(AbstractModule.java:59)
	at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223)
	at com.google.inject.spi.Elements.getElements(Elements.java:101)
	at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:133)
	at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:103)
	at com.google.inject.Guice.createInjector(Guice.java:95)
	at com.google.inject.Guice.createInjector(Guice.java:72)
	at com.google.inject.Guice.createInjector(Guice.java:62)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)




Hadoop-Mapreduce-trunk - Build # 3305 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3305/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32648 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.052 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:26 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 37.730 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.094 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:08 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:39 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:08 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.799 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [10:52 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.334 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:39 h
[INFO] Finished at: 2016-05-11T17:35:18+00:00
[INFO] Final Memory: 36M/703M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3304 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3304/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32132 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.708 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:22 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 36.415 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.405 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:00 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.813 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [10:59 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.313 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 24:14 min
[INFO] Finished at: 2016-05-11T08:50:29+00:00
[INFO] Final Memory: 37M/828M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:403)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:190)




Hadoop-Mapreduce-trunk - Build # 3303 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3303/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32648 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.875 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:23 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 36.018 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.066 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:59 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:43 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:02 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.920 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [11:04 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.297 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:33 h
[INFO] Finished at: 2016-05-11T05:14:06+00:00
[INFO] Final Memory: 36M/720M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3302 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3302/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32673 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.863 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.220 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.106 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:30 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:52 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.465 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:44 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.252 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-05-11T00:16:13+00:00
[INFO] Final Memory: 35M/595M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testCustomAbort

Error Message:
Job status not available 

Stack Trace:
java.io.IOException: Job status not available 
	at org.apache.hadoop.mapreduce.Job.updateStatus(Job.java:331)
	at org.apache.hadoop.mapreduce.Job.isComplete(Job.java:604)
	at org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1400)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1362)
	at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testFailedJob(TestJobOutputCommitter.java:174)
	at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testCustomAbort(TestJobOutputCommitter.java:256)


FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3301 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3301/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 44145 lines...]
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.955 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:57 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 31.114 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.338 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:23 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:26 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:00 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.466 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ FAILURE [09:06 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.294 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:27 h
[INFO] Finished at: 2016-05-10T20:37:24+00:00
[INFO] Final Memory: 34M/603M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-nativetask: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-nativetask/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.conf.TestNoDefaultsJobConf.testNoDefaults

Error Message:
org/apache/hadoop/test/GenericTestUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/test/GenericTestUtils
	at org.apache.hadoop.util.JarFinder.getJar(JarFinder.java:157)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:67)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:156)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.test.GenericTestUtils
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.util.JarFinder.getJar(JarFinder.java:157)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:67)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:156)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.conf.TestNoDefaultsJobConf.testNoDefaults

Error Message:
org/apache/hadoop/util/IntrusiveCollection$IntrusiveIterator

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/IntrusiveCollection$IntrusiveIterator
	at org.apache.hadoop.util.IntrusiveCollection.iterator(IntrusiveCollection.java:213)
	at org.apache.hadoop.util.IntrusiveCollection.clear(IntrusiveCollection.java:368)
	at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.clearPendingCachingCommands(DatanodeManager.java:1721)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.stopActiveServices(FSNamesystem.java:1244)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.close(FSNamesystem.java:1589)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.stopCommonServices(NameNode.java:814)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:993)
	at org.apache.hadoop.hdfs.MiniDFSCluster.stopAndJoinNameNode(MiniDFSCluster.java:1965)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1911)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1882)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1875)
	at org.apache.hadoop.mapred.HadoopTestCase.tearDown(HadoopTestCase.java:178)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
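
Note on the error above (and the similar org/apache/hadoop/yarn/util/Apps and MRApps errors further down): NoClassDefFoundError, unlike ClassNotFoundException, means a class that resolved at compile time could not be loaded or initialized when first used at run time. On a shared Jenkins slave this usually points to a classpath that changed mid-run, or to an earlier static-initialization failure that poisons every later use of the class. A minimal, self-contained sketch of the second mechanism (plain Java, nothing Hadoop-specific) is below.

// Minimal illustration (not Hadoop code): once a class's static initializer
// fails, every later attempt to use that class in the same JVM surfaces as
// java.lang.NoClassDefFoundError rather than the original failure -- one
// common way this error shows up late in a test run, e.g. during teardown.
public class NoClassDefFoundSketch {
  static class BadInit {
    static {
      if (true) { // defeat the compiler's "initializer must complete normally" check
        throw new IllegalStateException("static initializer failed");
      }
    }
  }

  public static void main(String[] args) {
    try {
      new BadInit();                      // first use: ExceptionInInitializerError
    } catch (Throwable t) {
      System.err.println("first use : " + t);
    }
    try {
      new BadInit();                      // later uses: NoClassDefFoundError
    } catch (Throwable t) {
      System.err.println("second use: " + t);
    }
  }
}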


FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
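
The message above is produced by JUnit's assertEquals(message, expected, actual): four files were left in the job's output directory where zero were expected after the abort. A rough sketch of that kind of check follows; the helper name and path are hypothetical, not the actual TestMRCJCFileOutputCommitter source.

// A rough sketch, not the actual test source: how a JUnit 4 assertEquals with
// a message yields "Output directory not empty expected:<0> but was:<4>".
import static org.junit.Assert.assertEquals;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class AbortedOutputCheckSketch {
  // Asserts nothing is left behind in outDir; if four files survive the abort,
  // JUnit reports: Output directory not empty expected:<0> but was:<4>
  static void assertOutputDirEmpty(FileSystem fs, Path outDir) throws Exception {
    FileStatus[] leftovers = fs.listStatus(outDir);
    assertEquals("Output directory not empty", 0, leftovers.length);
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.getLocal(conf);
    assertOutputDirEmpty(fs, new Path("/tmp/hypothetical-job-output"));
  }
}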


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)
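
testTaskEnv and testTaskOldEnv recur across these builds with the same generic "environment checker job failed" assertion. They appear to exercise the two ways a job can pass environment variables to its map/reduce children; a minimal configuration sketch (values made up, not the test's own code) is below.

// A minimal configuration sketch of the two styles the env-checker job appears
// to exercise: the current per-task-type keys and the deprecated
// mapred.child.env key. Variable names and values here are hypothetical.
import org.apache.hadoop.mapred.JobConf;

public class ChildEnvConfigSketch {
  // Current style: comma-separated NAME=VALUE pairs, one key per task type.
  public static JobConf currentStyle() {
    JobConf conf = new JobConf();
    conf.set("mapreduce.map.env", "MY_PATH=/tmp,LANG=en_us_8859_1");
    conf.set("mapreduce.reduce.env", "MY_PATH=/tmp,LANG=en_us_8859_1");
    return conf;
  }

  // Deprecated style: a single key applied to both map and reduce children.
  public static JobConf oldStyle() {
    JobConf conf = new JobConf();
    conf.set("mapred.child.env", "MY_PATH=/tmp,LANG=en_us_8859_1");
    return conf;
  }
}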


FAILED:  org.apache.hadoop.mapred.nativetask.kvtest.LargeKVTest.testKeySize

Error Message:
org/apache/hadoop/yarn/util/Apps

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/util/Apps
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:92)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:172)
	at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:788)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:244)
	at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1341)
	at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1338)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1755)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1338)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1359)
	at org.apache.hadoop.mapred.nativetask.kvtest.KVJob.runJob(KVJob.java:108)
	at org.apache.hadoop.mapred.nativetask.kvtest.LargeKVTest.runKVSizeTests(LargeKVTest.java:109)
	at org.apache.hadoop.mapred.nativetask.kvtest.LargeKVTest.testKeySize(LargeKVTest.java:62)


FAILED:  org.apache.hadoop.mapred.nativetask.kvtest.LargeKVTest.testValueSize

Error Message:
org/apache/hadoop/mapreduce/v2/util/MRApps

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/util/MRApps
	at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:92)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:172)
	at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:788)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:244)
	at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1341)
	at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1338)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1755)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1338)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1359)
	at org.apache.hadoop.mapred.nativetask.kvtest.KVJob.runJob(KVJob.java:108)
	at org.apache.hadoop.mapred.nativetask.kvtest.LargeKVTest.runKVSizeTests(LargeKVTest.java:109)
	at org.apache.hadoop.mapred.nativetask.kvtest.LargeKVTest.testValueSize(LargeKVTest.java:67)




Hadoop-Mapreduce-trunk - Build # 3300 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3300/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32241 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.965 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:30 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 55.593 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [ 11.068 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [12:49 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. FAILURE [07:43 min]
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  2.316 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [11:29 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.338 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 36:49 min
[INFO] Finished at: 2016-05-10T09:05:51+00:00
[INFO] Final Memory: 38M/851M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-hs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-hs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobListCache.testAddExisting

Error Message:
test timed out after 1000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 1000 milliseconds
	at org.mockito.asm.MethodWriter.visitMethodInsn(MethodWriter.java:809)
	at org.mockito.cglib.core.CodeEmitter.emit_invoke(CodeEmitter.java:501)
	at org.mockito.cglib.core.CodeEmitter.super_invoke(CodeEmitter.java:480)
	at org.mockito.cglib.proxy.MethodInterceptorGenerator.superHelper(MethodInterceptorGenerator.java:144)
	at org.mockito.cglib.proxy.MethodInterceptorGenerator.generate(MethodInterceptorGenerator.java:104)
	at org.mockito.cglib.proxy.Enhancer.emitMethods(Enhancer.java:948)
	at org.mockito.cglib.proxy.Enhancer.generateClass(Enhancer.java:499)
	at org.mockito.cglib.core.DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:25)
	at org.mockito.cglib.core.AbstractClassGenerator.create(AbstractClassGenerator.java:217)
	at org.mockito.cglib.proxy.Enhancer.createHelper(Enhancer.java:378)
	at org.mockito.cglib.proxy.Enhancer.createClass(Enhancer.java:318)
	at org.mockito.internal.creation.jmock.ClassImposterizer.createProxyClass(ClassImposterizer.java:93)
	at org.mockito.internal.creation.jmock.ClassImposterizer.imposterise(ClassImposterizer.java:50)
	at org.mockito.internal.util.MockUtil.createMock(MockUtil.java:54)
	at org.mockito.internal.MockitoCore.mock(MockitoCore.java:45)
	at org.mockito.Mockito.mock(Mockito.java:921)
	at org.mockito.Mockito.mock(Mockito.java:816)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobListCache.testAddExisting(TestJobListCache.java:39)
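
The "test timed out after 1000 milliseconds" exception above comes from JUnit 4's per-test timeout, and the stack shows the test still inside Mockito/cglib proxy generation when the one-second budget expired -- on a loaded build slave that can happen before the test body proper even starts. A minimal sketch of the mechanism follows; the interface and test names are hypothetical.

// Minimal sketch of the JUnit 4 mechanism behind the failure above:
// @Test(timeout = 1000) makes the runner abort the method and report
// "test timed out after 1000 milliseconds" if it has not returned in time,
// even if the time was spent generating the first Mockito proxy class.
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.Test;

public class TimeoutBudgetSketch {
  interface Collaborator {
    void doWork();
  }

  @Test(timeout = 1000)
  public void finishesWithinOneSecond() {
    Collaborator c = mock(Collaborator.class); // first-time cglib class generation can be slow
    c.doWork();
    verify(c).doWork();
  }
}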




Hadoop-Mapreduce-trunk - Build # 3299 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3299/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32647 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.367 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:12 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 33.377 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.069 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:19 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:03 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.435 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:18 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.238 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:15 h
[INFO] Finished at: 2016-05-10T02:10:39+00:00
[INFO] Final Memory: 35M/602M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3298 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3298/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32197 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  6.423 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:31 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 35.899 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.380 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:19 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  2.171 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [11:43 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.309 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:29 min
[INFO] Finished at: 2016-05-09T00:32:35+00:00
[INFO] Final Memory: 38M/838M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:403)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:190)




Hadoop-Mapreduce-trunk - Build # 3297 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3297/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 44013 lines...]
[INFO] Building Apache Hadoop MapReduce 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.397 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:04 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.363 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.681 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:59 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:54 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:00 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.518 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:30 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.244 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:27 h
[INFO] Finished at: 2016-05-08T20:26:02+00:00
[INFO] Final Memory: 36M/750M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3296 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3296/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32143 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.935 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:24 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 36.621 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.812 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:04 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.934 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [11:57 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.323 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:18 min
[INFO] Finished at: 2016-05-07T10:11:00+00:00
[INFO] Final Memory: 37M/828M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Task state not correct expected:<KILLED> but was:<SCHEDULED>

Stack Trace:
java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<SCHEDULED>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:403)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:190)




Hadoop-Mapreduce-trunk - Build # 3295 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3295/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32647 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.806 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:25 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 35.951 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.709 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:53 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:31 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:04 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  2.015 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [11:18 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.306 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:35 h
[INFO] Finished at: 2016-05-07T06:51:00+00:00
[INFO] Final Memory: 36M/733M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3294 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3294/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32647 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.764 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:35 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 40.179 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.975 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:58 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:36 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:05 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  2.303 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [11:43 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.288 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:38 h
[INFO] Finished at: 2016-05-07T01:35:55+00:00
[INFO] Final Memory: 35M/720M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3293 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3293/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32653 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.916 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:24 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 36.583 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.524 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:22 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:33 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:00 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.806 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [11:31 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.360 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:32 h
[INFO] Finished at: 2016-05-06T11:22:37+00:00
[INFO] Final Memory: 35M/747M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestTextInputFormat.testSplitableCodecs

Error Message:
Key in multiple partitions.

Stack Trace:
java.lang.AssertionError: Key in multiple partitions.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertFalse(Assert.java:64)
	at org.apache.hadoop.mapred.TestTextInputFormat.testSplitableCodecs(TestTextInputFormat.java:223)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3292 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3292/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32647 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.843 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:56 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.845 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.986 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:31 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:45 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.421 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:36 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.241 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:11 h
[INFO] Finished at: 2016-05-06T07:19:38+00:00
[INFO] Final Memory: 35M/598M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3291 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3291/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32656 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.266 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.651 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.150 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:35 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:40 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.462 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:38 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.266 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:12 h
[INFO] Finished at: 2016-05-06T04:40:19+00:00
[INFO] Final Memory: 35M/595M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)
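
The two TestMiniMRChildTask failures above surface through Assert.assertTrue(String, boolean), which throws java.lang.AssertionError carrying the supplied message when the condition is false. Below is a minimal sketch of that pattern; the success flag is a hypothetical stand-in for the submitted environment-checker job, and the real runTestTaskEnv wiring is not reproduced here.

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class EnvCheckerAssertSketch {
      // Hypothetical stand-in for the outcome of the environment-checker job.
      private boolean envCheckerJobSucceeded() {
        return true; // a false value here would reproduce the AssertionError seen above
      }

      @Test
      public void envCheckerJobMustSucceed() {
        // assertTrue(String, boolean): a false condition fails with
        // "java.lang.AssertionError: The environment checker job failed."
        assertTrue("The environment checker job failed.", envCheckerJobSucceeded());
      }
    }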




Hadoop-Mapreduce-trunk - Build # 3290 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3290/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32655 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.926 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.877 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.249 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:30 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:38 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:45 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.656 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:46 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.247 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:11 h
[INFO] Finished at: 2016-05-06T02:00:06+00:00
[INFO] Final Memory: 36M/720M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3289 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3289/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 55474 lines...]
[INFO] Building Apache Hadoop MapReduce 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.842 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.113 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.080 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:35 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:44 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:54 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.538 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [08:51 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.267 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:19 h
[INFO] Finished at: 2016-05-05T22:54:45+00:00
[INFO] Final Memory: 35M/595M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3288 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3288/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32651 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.991 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.334 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.243 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:36 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:39 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.427 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:08 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.232 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-05-05T17:07:26+00:00
[INFO] Final Memory: 35M/595M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3287 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3287/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32651 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.920 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:54 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.435 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.161 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:23 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.410 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [08:57 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.246 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-05-05T10:52:27+00:00
[INFO] Final Memory: 35M/603M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3286 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3286/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32650 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.130 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:55 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.466 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.457 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:30 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:41 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.563 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:33 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.250 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-05-05T08:11:27+00:00
[INFO] Final Memory: 35M/595M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk - Build # 3285 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3285/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32686 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.076 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:55 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.645 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.916 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:43 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:42 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.463 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:10 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.243 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:14 h
[INFO] Finished at: 2016-05-05T05:03:37+00:00
[INFO] Final Memory: 35M/595M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
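
The TestUberAM failures above go through the message-less Assert.assertTrue(boolean) instead (note the extra frame at Assert.java:52 and the "java.lang.AssertionError: null" text). A minimal sketch of why the message comes out as "null"; the counter values are hypothetical and do not reflect the real verifySleepJobCounters or verifyRandomWriterCounters checks.

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class MessagelessAssertSketch {
      @Test
      public void counterCheckWithoutMessage() {
        long observedCounter = 4L; // hypothetical counter value read from the finished job
        long expectedCounter = 4L; // hypothetical expected value
        // assertTrue(boolean) delegates to assertTrue(null, condition), so when the
        // condition is false the failure is reported as "java.lang.AssertionError: null"
        // with no descriptive message, matching the traces above.
        assertTrue(observedCounter == expectedCounter);
      }
    }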




Hadoop-Mapreduce-trunk - Build # 3284 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3284/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32734 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.881 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:54 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.570 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.043 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:38 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:41 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.448 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:12 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.238 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-05-05T02:21:47+00:00
[INFO] Final Memory: 37M/607M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk - Build # 3283 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3283/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32688 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.157 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:54 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.920 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.038 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:38 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:42 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.406 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [08:53 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.244 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:12 h
[INFO] Finished at: 2016-05-04T23:22:47+00:00
[INFO] Final Memory: 35M/595M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk - Build # 3282 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3282/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32676 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.203 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:01 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.452 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.331 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:54 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:51 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:53 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.449 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:02 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.239 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:20 h
[INFO] Finished at: 2016-05-04T20:18:25+00:00
[INFO] Final Memory: 35M/607M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk - Build # 3281 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3281/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32677 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.863 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.282 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.072 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:25 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.399 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:18 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.246 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:08 h
[INFO] Finished at: 2016-05-04T17:06:35+00:00
[INFO] Final Memory: 35M/598M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
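
The two TestMiniMRChildTask entries above do carry a readable message because the assertion was given one. A hypothetical reconstruction of that idiom, using the old org.apache.hadoop.mapred API (a sketch only, not the actual TestMiniMRChildTask code):

    import static org.junit.Assert.assertTrue;

    import java.io.IOException;

    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.RunningJob;

    public class EnvCheckerAssertionSketch {
        // Submits the hypothetical environment-checker job and asserts it succeeded,
        // producing "The environment checker job failed." when it does not.
        static void runAndVerify(JobConf conf) throws IOException {
            RunningJob job = JobClient.runJob(conf);  // blocks until the job completes
            assertTrue("The environment checker job failed.", job.isSuccessful());
        }
    }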




Hadoop-Mapreduce-trunk - Build # 3280 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3280/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32679 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.788 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.516 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.105 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:28 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.436 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:34 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.244 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:14 h
[INFO] Finished at: 2016-05-04T11:04:42+00:00
[INFO] Final Memory: 35M/598M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk - Build # 3279 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3279/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32678 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.208 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:53 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.744 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.996 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:46 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.390 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:12 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.234 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:14 h
[INFO] Finished at: 2016-05-04T08:13:26+00:00
[INFO] Final Memory: 35M/733M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk - Build # 3278 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3278/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32133 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.092 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.580 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.067 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [08:40 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.596 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:38 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.236 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 20:52 min
[INFO] Finished at: 2016-05-04T01:18:57+00:00
[INFO] Final Memory: 35M/691M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)
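
Unlike the message-less failures in the other reports, this entry is readable because JUnit's Assert.assertEquals formats a mismatch as "expected:<...> but was:<...>" via failNotEquals, as the stack trace shows. A minimal sketch of how that message arises; the enum is a hypothetical stand-in for the job state type compared in TestJobImpl:

    import static org.junit.Assert.assertEquals;

    public class JobStateAssertionSketch {
        enum JobState { SUCCEEDED, ERROR }  // hypothetical stand-in, not the real job state enum

        public static void main(String[] args) {
            JobState expected = JobState.SUCCEEDED;
            JobState actual = JobState.ERROR;  // hypothetical observed state

            // On mismatch, JUnit reports "expected:<SUCCEEDED> but was:<ERROR>",
            // matching the entry above.
            assertEquals(expected, actual);
        }
    }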




Hadoop-Mapreduce-trunk - Build # 3277 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3277/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32678 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.892 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:54 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.290 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.094 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:33 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:39 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.391 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:41 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.263 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:08 h
[INFO] Finished at: 2016-05-03T23:06:56+00:00
[INFO] Final Memory: 36M/736M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk - Build # 3276 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3276/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32679 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.989 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.360 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.119 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:31 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:40 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:51 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.401 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [08:55 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.251 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:16 h
[INFO] Finished at: 2016-05-03T20:25:23+00:00
[INFO] Final Memory: 35M/607M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk - Build # 3275 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3275/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32684 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.308 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:06 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.750 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.472 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:54 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:52 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.432 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [10:43 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.270 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:15 h
[INFO] Finished at: 2016-05-03T14:14:06+00:00
[INFO] Final Memory: 36M/720M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
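
The first entry in this report (TestMRCJCFileOutputCommitter.testAbort) shows the same assertEquals mechanism with a caller-supplied prefix: when a message is passed, JUnit prepends it to the mismatch text, yielding "Output directory not empty expected:<0> but was:<4>". A small sketch with a hypothetical file count:

    import static org.junit.Assert.assertEquals;

    public class OutputDirAssertionSketch {
        public static void main(String[] args) {
            int leftoverFiles = 4;  // hypothetical number of files left behind after abort

            // Produces "Output directory not empty expected:<0> but was:<4>" on failure,
            // matching the testAbort entry above.
            assertEquals("Output directory not empty", 0, leftoverFiles);
        }
    }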




Hadoop-Mapreduce-trunk - Build # 3274 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3274/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32674 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  7.172 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [01:03 min]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [ 13.930 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:42 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:20 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:06 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.743 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [11:18 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.332 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:40 h
[INFO] Finished at: 2016-05-03T10:39:04+00:00
[INFO] Final Memory: 37M/1044M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk - Build # 3273 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3273/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32674 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.144 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.983 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.210 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:27 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.451 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:21 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.285 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:14 h
[INFO] Finished at: 2016-05-03T07:07:09+00:00
[INFO] Final Memory: 35M/602M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk - Build # 3272 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3272/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32681 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.807 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:29 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 38.169 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.605 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:21 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:30 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:57 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  2.085 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [11:25 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.304 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:29 h
[INFO] Finished at: 2016-05-03T04:01:42+00:00
[INFO] Final Memory: 35M/598M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRTimelineEventHandling.testMapreduceJobTimelineServiceEnabled

Error Message:
Job didn't finish in 30 seconds

Stack Trace:
java.io.IOException: Job didn't finish in 30 seconds
	at org.apache.hadoop.mapred.UtilsForTests.runJobSucceed(UtilsForTests.java:622)
	at org.apache.hadoop.mapred.TestMRTimelineEventHandling.testMapreduceJobTimelineServiceEnabled(TestMRTimelineEventHandling.java:172)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
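
The first failure in this run ("Job didn't finish in 30 seconds") is a client-side timeout rather than a failed assertion: the test helper gives the submitted job a fixed window to complete and raises an IOException when that window expires. A rough, hypothetical sketch of that polling pattern, assuming the old-API org.apache.hadoop.mapred.RunningJob interface (this is not the actual UtilsForTests code):

    import java.io.IOException;
    import org.apache.hadoop.mapred.RunningJob;

    public class JobTimeoutSketch {
        // Poll the job until it completes or the timeout elapses.
        static void waitForCompletion(RunningJob job, long timeoutMs)
                throws IOException, InterruptedException {
            long deadline = System.currentTimeMillis() + timeoutMs;
            while (!job.isComplete()) {
                if (System.currentTimeMillis() > deadline) {
                    throw new IOException("Job didn't finish in " + (timeoutMs / 1000) + " seconds");
                }
                Thread.sleep(1000);  // wait a second between status checks
            }
        }
    }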




Hadoop-Mapreduce-trunk - Build # 3271 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3271/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32673 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.964 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.942 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.031 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:35 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:43 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.455 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:28 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.240 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:17 h
[INFO] Finished at: 2016-04-30T22:15:48+00:00
[INFO] Final Memory: 35M/602M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
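
Because the same jobclient tests fail run after run, they can be reproduced outside Jenkins by building only the failing module and selecting a single test class via Surefire's standard -Dtest property. The goals actually used by this Jenkins job are not shown in the console excerpt, so the commands below are only an illustration:

    cd hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient
    mvn test -Dtest=TestUberAM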




Hadoop-Mapreduce-trunk - Build # 3270 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3270/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32674 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.077 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.331 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.191 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:30 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:38 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.495 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:03 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.248 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-04-30T09:07:13+00:00
[INFO] Final Memory: 35M/603M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk - Build # 3269 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3269/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32671 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.911 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:20 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 34.433 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.391 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:37 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:25 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:00 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.707 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [11:46 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.360 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:31 h
[INFO] Finished at: 2016-04-30T01:04:32+00:00
[INFO] Final Memory: 35M/747M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk - Build # 3268 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3268/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32320 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.112 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.160 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.324 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:29 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. FAILURE [05:17 min]
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.577 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [08:58 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.239 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:16 min
[INFO] Finished at: 2016-04-29T21:22:23+00:00
[INFO] Final Memory: 37M/644M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-hs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-hs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
8 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryServer.testStartStopServer

Error Message:
org/apache/hadoop/security/authorize/AccessControlList

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/security/authorize/AccessControlList
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapred.JobACLsManager.<init>(JobACLsManager.java:41)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceInit(HistoryFileManager.java:555)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:96)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:152)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryServer.testStartStopServer(TestJobHistoryServer.java:78)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryServer.testStartStopServer

Error Message:
org/apache/hadoop/service/ServiceOperations

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/service/ServiceOperations
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceStop(JobHistoryServer.java:208)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryServer.stop(TestJobHistoryServer.java:217)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryServer.testReports

Error Message:
org/apache/hadoop/net/DNSToSwitchMapping

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/net/DNSToSwitchMapping
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryServer.testReports(TestJobHistoryServer.java:96)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryServer.testLaunch

Error Message:
org/apache/hadoop/util/ExitUtil

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/ExitUtil
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryServer.testLaunch(TestJobHistoryServer.java:204)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobs.testJobAttemptsXML

Error Message:
expected:<application/xml> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/xml> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobs.testJobAttemptsXML(TestHsWebServicesJobs.java:747)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobs.testJobAttemptsSlash

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobs.testJobAttemptsSlash(TestHsWebServicesJobs.java:711)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobs.testJobAttempts

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobs.testJobAttempts(TestHsWebServicesJobs.java:693)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobs.testJobAttemptsDefault

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobs.testJobAttemptsDefault(TestHsWebServicesJobs.java:729)
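
Two distinct problems appear in this run. The NoClassDefFoundError failures in TestJobHistoryServer mean that classes present at compile time (AccessControlList, ServiceOperations, DNSToSwitchMapping, ExitUtil) could not be loaded in the test JVM, which usually indicates a broken or incomplete test classpath rather than a defect in the tests themselves. The TestHsWebServicesJobs failures are plain JUnit equality assertions on the HTTP Content-Type of the web service response; the "expected:<...> but was:<...>" wording is JUnit's standard assertEquals failure format, as in this small hypothetical sketch (not the actual test code):

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class ContentTypeSketch {
        @Test
        public void contentTypeMismatch() {
            // Stand-in for a Content-Type header read from the service under test.
            String actual = "text/html; charset=ISO-8859-1";
            // Fails with: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
            assertEquals("application/json", actual);
        }
    }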




Hadoop-Mapreduce-trunk - Build # 3267 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3267/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32671 lines...]
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.809 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.322 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.993 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:23 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:34 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:56 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.553 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:12 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.265 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:21 h
[INFO] Finished at: 2016-04-29T20:29:49+00:00
[INFO] Final Memory: 35M/595M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.7.0_55/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter7855912183543717430.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire2989661095853688921tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2748718700504024234177tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
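
Unlike most of the runs above, this build's Maven error is not an assertion failure: Surefire lost its forked test JVM ("The forked VM terminated without properly saying goodbye"), which happens when the fork crashes, exhausts memory, or something calls System.exit() before results are reported back, exactly as the error text suggests. A deliberately broken, hypothetical test that reproduces this class of failure:

    import org.junit.Test;

    public class ForkKillerSketch {
        @Test
        public void killsTheForkedJvm() {
            // System.exit() terminates the forked Surefire JVM immediately,
            // so Maven never receives the test results from this fork.
            System.exit(1);
        }
    }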




Hadoop-Mapreduce-trunk - Build # 3266 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3266/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32650 lines...]
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.835 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.883 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.075 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:26 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:16 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.666 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:21 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.243 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:42 h
[INFO] Finished at: 2016-04-29T17:41:06+00:00
[INFO] Final Memory: 35M/615M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.7.0_55/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter7652958486177092567.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire6972667254435952384tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_1845781313100195627439tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath.testJobWithDFS

Error Message:
org/apache/hadoop/yarn/server/MiniYARNCluster

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/server/MiniYARNCluster
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath.testJobWithDFS(TestSpecialCharactersInOutputPath.java:113)


FAILED:  org.apache.hadoop.mapreduce.TestLargeSort.testLargeSort

Error Message:
org/apache/hadoop/yarn/server/nodemanager/containermanager/localizer/ResourceLocalizationService$CacheCleanup

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/server/nodemanager/containermanager/localizer/ResourceLocalizationService$CacheCleanup
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.serviceStart(ResourceLocalizationService.java:354)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl.serviceStart(ContainerManagerImpl.java:503)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceStart(NodeManager.java:386)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$NodeManagerWrapper.serviceStart(MiniYARNCluster.java:579)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
	at org.apache.hadoop.mapreduce.TestLargeSort.setup(TestLargeSort.java:40)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
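
A note on the "Error Message: null" entries above: TestMRJobs.verifySleepJobCounters (TestMRJobs.java:474) and verifyRandomWriterCounters call the single-argument JUnit Assert.assertTrue, which on failure throws an AssertionError with no message, so Surefire prints the message as "null"; the actual counter condition being checked is not recoverable from this report. Below is a minimal, self-contained sketch of that failure mode only; the class name and the countersLookConsistent() condition are made up for illustration and are not the real test logic.

    import org.junit.Assert;
    import org.junit.Test;

    public class NullMessageFailureSketch {

        // Hypothetical stand-in for the counter check performed in
        // TestMRJobs.verifySleepJobCounters; the real condition is not shown
        // in the report and is not reproduced here.
        private boolean countersLookConsistent() {
            return false; // force the failing branch for demonstration
        }

        @Test
        public void showsNullErrorMessage() {
            // assertTrue(boolean) carries no message argument, so on failure
            // JUnit throws a bare AssertionError and the surefire report
            // renders it as "Error Message: null", as seen above.
            Assert.assertTrue(countersLookConsistent());
        }
    }

Using the two-argument form, Assert.assertTrue("sleep job counters mismatch", condition), would make reports like this one show a readable message instead of "null".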




Hadoop-Mapreduce-trunk - Build # 3265 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3265/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31402 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.834 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:48 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.420 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:53 min
[INFO] Finished at: 2016-04-29T11:59:07+00:00
[INFO] Final Memory: 30M/697M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)




Hadoop-Mapreduce-trunk - Build # 3264 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3264/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32624 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.083 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:53 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.083 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.072 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:32 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.443 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [08:58 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.255 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-04-29T10:16:48+00:00
[INFO] Final Memory: 35M/598M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk - Build # 3263 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3263/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32631 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.950 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.294 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.495 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:37 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:39 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.410 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [09:14 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.254 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-04-29T07:08:22+00:00
[INFO] Final Memory: 35M/603M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
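
The TestMRCJCFileOutputCommitter.testAbort failure above ("Output directory not empty expected:<0> but was:<4>") is an assertEquals over a file count: after the abort path runs, four files are still present where the test expects an empty output directory. The sketch below shows the general shape of such a check against a Hadoop FileSystem; the helper name and its use are assumptions for illustration, not the actual test code.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.junit.Assert;

    public class AbortCleanupCheckSketch {

        // Hypothetical helper mirroring the kind of check failing in
        // TestMRCJCFileOutputCommitter.testAbort: after aborting the job,
        // the output directory should contain no leftover files.
        static void assertOutputEmpty(Configuration conf, Path outDir) throws Exception {
            FileSystem fs = outDir.getFileSystem(conf);
            FileStatus[] leftovers = fs.listStatus(outDir);
            // assertEquals renders a mismatch as
            // "Output directory not empty expected:<0> but was:<N>",
            // matching the error message in the report above.
            Assert.assertEquals("Output directory not empty", 0, leftovers.length);
        }
    }

A non-zero count here typically means temporary or committed task files survived the abort, which is what the "<4>" in the message indicates.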




Hadoop-Mapreduce-trunk - Build # 3262 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3262/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32094 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.1:install (default-install) @ hadoop-mapreduce ---
[INFO] Installing /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/pom.xml to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/3.0.0-SNAPSHOT/hadoop-mapreduce-3.0.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.412 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:02 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 32.786 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.127 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [09:13 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [  1.581 s]
[INFO] Apache Hadoop MapReduce NativeTask ................ SUCCESS [11:08 min]
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SUCCESS [  0.271 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23:10 min
[INFO] Finished at: 2016-04-28T22:35:39+00:00
[INFO] Final Memory: 37M/861M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)
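
For the TestJobImpl.testUnusableNodeTransition failure above, the "expected:<SUCCEEDED> but was:<ERROR>" message comes from assertJobState comparing the job state the test expects with the state the job actually reached: after the unusable-node event the job ended in ERROR rather than SUCCEEDED. The snippet below only illustrates the assertion mechanics; the enum is a stand-in for the MapReduce JobState record and the MRApp test harness is not modelled.

    import org.junit.Assert;

    public class JobStateAssertSketch {

        // Stand-in for org.apache.hadoop.mapreduce.v2.api.records.JobState;
        // only the two values appearing in the report are modelled here.
        enum JobState { SUCCEEDED, ERROR }

        // Mirrors the shape of TestJobImpl.assertJobState: compare the
        // expected terminal state with the state the job actually reached.
        static void assertJobState(JobState expected, JobState actual) {
            // On mismatch JUnit reports "expected:<SUCCEEDED> but was:<ERROR>",
            // the exact message shown in the failure above.
            Assert.assertEquals(expected, actual);
        }

        public static void main(String[] args) {
            assertJobState(JobState.SUCCEEDED, JobState.ERROR);
        }
    }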



Hadoop-Mapreduce-trunk - Build # 3261 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3261/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32129 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.955 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 78.413 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.445 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.427 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.936 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.781 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.118 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:39 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:39 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-04-28T21:02:12+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3260 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3260/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 35061 lines...]
+-org.apache.hadoop:hadoop-mapreduce-client-hs:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-hdfs:3.0.0-20160428.175418-6565
    +-org.apache.hadoop:hadoop-hdfs-client:3.0.0-SNAPSHOT
and
+-org.apache.hadoop:hadoop-mapreduce-client-hs:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-hdfs:3.0.0-20160428.175418-6565
    +-org.apache.hadoop:hadoop-hdfs-client:3.0.0-SNAPSHOT

[WARNING] Rule 0: org.apache.maven.plugins.enforcer.DependencyConvergence failed with message:
Failed while enforcing releasability the error(s) are [
Dependency convergence error for org.apache.hadoop:hadoop-hdfs-client:3.0.0-20160428.175413-990 paths to dependency are:
+-org.apache.hadoop:hadoop-mapreduce-client-hs:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-hdfs-client:3.0.0-20160428.175413-990
and
+-org.apache.hadoop:hadoop-mapreduce-client-hs:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-hdfs:3.0.0-20160428.175418-6565
    +-org.apache.hadoop:hadoop-hdfs-client:3.0.0-SNAPSHOT
and
+-org.apache.hadoop:hadoop-mapreduce-client-hs:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-hdfs:3.0.0-20160428.175418-6565
    +-org.apache.hadoop:hadoop-hdfs-client:3.0.0-SNAPSHOT
]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.999 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.800 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.086 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:27 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. FAILURE [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16:34 min
[INFO] Finished at: 2016-04-28T18:13:53+00:00
[INFO] Final Memory: 38M/649M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (depcheck) on project hadoop-mapreduce-client-hs: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-hs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed

Hadoop-Mapreduce-trunk - Build # 3259 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3259/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32130 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.842 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.905 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.623 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.145 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.848 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:02 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 31.416 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.140 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:37 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:39 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2016-04-28T03:43:07+00:00
[INFO] Final Memory: 34M/750M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3258 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3258/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32129 lines...]
Running org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.07 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.865 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.511 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.61 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.017 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.377 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.355 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:34 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:00 h
[INFO] Finished at: 2016-04-28T01:10:15+00:00
[INFO] Final Memory: 34M/724M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3257 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3257/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32140 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.226 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.685 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.268 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.577 sec - in org.apache.hadoop.hdfs.TestNNBench

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.980 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.032 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.313 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [08:38 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:48 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-27T22:17:00+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3256 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3256/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32129 lines...]
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.365 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.949 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.321 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.172 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.743 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:18 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 34.143 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.887 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:53 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:27 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:05 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:25 h
[INFO] Finished at: 2016-04-27T19:38:48+00:00
[INFO] Final Memory: 35M/1018M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
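
Note on the repeated "Error Message: null" entries above: they come from JUnit 4's single-argument assertTrue(boolean), which throws an AssertionError with no message when the condition is false. Below is a minimal, hypothetical sketch of that pattern in the style of a counter check like TestMRJobs.verifySleepJobCounters; the counter name and threshold are placeholders, not the actual assertions at TestMRJobs.java:474.

    import static org.junit.Assert.assertTrue;

    import org.apache.hadoop.mapreduce.Counters;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.JobCounter;

    public class CounterCheckSketch {
      // Hypothetical counter check. The single-argument assertTrue carries no
      // failure message, so JUnit reports "java.lang.AssertionError: null".
      static void verifyCounters(Job job) throws Exception {
        Counters counters = job.getCounters();
        long launchedMaps =
            counters.findCounter(JobCounter.TOTAL_LAUNCHED_MAPS).getValue();
        assertTrue(launchedMaps > 0);  // on failure this surfaces as "Error Message: null"
        // A message-bearing variant would make the report self-explanatory:
        // assertTrue("expected at least one launched map", launchedMaps > 0);
      }
    }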



Hadoop-Mapreduce-trunk - Build # 3255 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3255/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32171 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 266.079 sec - in org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter
Running org.apache.hadoop.mapreduce.lib.output.TestMRCJCFileOutputCommitter
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.374 sec - in org.apache.hadoop.mapreduce.lib.output.TestMRCJCFileOutputCommitter
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.398 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.912 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.347 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.8 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 521, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.854 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:21 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 36.246 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.375 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:47 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:25 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:10 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:30 h
[INFO] Finished at: 2016-04-27T16:35:57+00:00
[INFO] Final Memory: 40M/750M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3254 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3254/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32175 lines...]
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.638 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.908 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.366 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.343 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.384 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:21 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 35.022 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.067 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:54 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:17 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:03 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:23 h
[INFO] Finished at: 2016-04-27T13:28:48+00:00
[INFO] Final Memory: 35M/762M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3253 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3253/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2929 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.463 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Running org.apache.hadoop.mapred.TestMapFileOutputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.389 sec - in org.apache.hadoop.mapred.TestMapFileOutputFormat
Running org.apache.hadoop.mapred.TestJobAclsManager
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.304 sec - in org.apache.hadoop.mapred.TestJobAclsManager
Running org.apache.hadoop.mapred.TestLineRecordReader
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.616 sec - in org.apache.hadoop.mapred.TestLineRecordReader
Running org.apache.hadoop.mapred.TestClock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.089 sec - in org.apache.hadoop.mapred.TestClock
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.095 sec - in org.apache.hadoop.mapred.TestJobQueueClient

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.878 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [02:01 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 min
[INFO] Finished at: 2016-04-27T06:37:19+00:00
[INFO] Final Memory: 42M/897M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
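
The single failure in this build is in TestCLI.testGetJob, which exercises the MapReduce command-line client. As a rough, hypothetical illustration of the same CLI (not the code at TestCLI.java:181), the "mapred job -status" path can be driven programmatically; the job id below is a placeholder and a running cluster with that job would be required.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.tools.CLI;
    import org.apache.hadoop.util.ToolRunner;

    public class JobStatusCliSketch {
      public static void main(String[] args) throws Exception {
        // Equivalent of "mapred job -status <job-id>"; returns a non-zero
        // exit code if the job status cannot be retrieved.
        int exitCode = ToolRunner.run(new Configuration(), new CLI(),
            new String[] { "-status", "job_1461600000000_0001" });
        System.out.println("exit code: " + exitCode);
      }
    }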



Hadoop-Mapreduce-trunk - Build # 3252 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3252/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32172 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.582 sec - in org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter
Running org.apache.hadoop.mapred.TestReporter
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.837 sec - in org.apache.hadoop.mapred.TestReporter
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.954 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.703 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.741 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.175 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 532, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.773 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:53 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.661 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.894 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:21 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:33 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:59 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:16 h
[INFO] Finished at: 2016-04-27T06:27:57+00:00
[INFO] Final Memory: 37M/611M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3251 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3251/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32133 lines...]
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.667 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.864 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.392 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.392 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.729 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:17 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 33.890 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.745 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:36 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:17 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:01 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:21 h
[INFO] Finished at: 2016-04-27T03:44:27+00:00
[INFO] Final Memory: 37M/1043M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
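
TestUberAM reruns the TestMRJobs jobs with MapReduce "uber" mode enabled, so the same counter assertions are evaluated for jobs that run entirely inside the ApplicationMaster. A hedged sketch of enabling that mode on a job follows; the property name is the standard mapreduce.job.ubertask.enable, but the job setup itself is hypothetical and not the test's own configuration.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.MRJobConfig;

    public class UberModeSketch {
      static Job newUberJob() throws Exception {
        Configuration conf = new Configuration();
        // Run small jobs entirely inside the MRAppMaster (uber mode),
        // the mode exercised by TestUberAM.
        conf.setBoolean(MRJobConfig.JOB_UBERTASK_ENABLE, true);
        return Job.getInstance(conf, "uber-sketch");
      }
    }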



Hadoop-Mapreduce-trunk - Build # 3250 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3250/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32133 lines...]
Running org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 66.015 sec - in org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService
Running org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 58.372 sec - in org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Running org.apache.hadoop.mapreduce.TestValueIterReset
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.025 sec - in org.apache.hadoop.mapreduce.TestValueIterReset
Running org.apache.hadoop.mapreduce.TestMapReduceLazyOutput
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 154.675 sec - in org.apache.hadoop.mapreduce.TestMapReduceLazyOutput

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.857 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:21 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 35.215 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  8.026 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:58 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:57 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:54 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-04-26T23:40:05+00:00
[INFO] Final Memory: 34M/719M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3249 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3249/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31594 lines...]
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.797 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices
Running org.apache.hadoop.mapreduce.v2.app.TestTaskHeartbeatHandler
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.09 sec - in org.apache.hadoop.mapreduce.v2.app.TestTaskHeartbeatHandler
Running org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.408 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.525 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.168 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.855 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:625->assertJobState:1012 expected:<COMMITTING> but was:<ERROR>

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.864 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [01:00 min]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [ 12.541 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [13:58 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 18:53 min
[INFO] Finished at: 2016-04-26T18:27:25+00:00
[INFO] Final Memory: 43M/1247M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<COMMITTING> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<COMMITTING> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:625)
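
The "expected:<COMMITTING> but was:<ERROR>" message is JUnit's standard assertEquals mismatch format. A hypothetical sketch of a state assertion in the spirit of TestJobImpl.assertJobState is shown below; it is not the actual helper at TestJobImpl.java:1012, which checks the job's internal state.

    import static org.junit.Assert.assertEquals;

    import org.apache.hadoop.mapreduce.v2.api.records.JobState;
    import org.apache.hadoop.mapreduce.v2.app.job.Job;

    public class JobStateAssertSketch {
      // On a mismatch, assertEquals reports
      // "expected:<COMMITTING> but was:<ERROR>".
      static void assertJobState(Job job, JobState expected) {
        assertEquals(expected, job.getState());
      }
    }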



Hadoop-Mapreduce-trunk - Build # 3248 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3248/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32155 lines...]
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.724 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.985 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.826 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests in error: 
  TestSpeculativeExecution.testSpeculativeExecution:249 » IO Job status not avai...
  TestJobCleanup.testCustomCleanup:321->testKilledJob:233 NullPointer

Tests run: 532, Failures: 5, Errors: 2, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.791 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.199 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.239 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:28 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:41 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:02 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:20 h
[INFO] Finished at: 2016-04-26T16:33:23+00:00
[INFO] Final Memory: 37M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestJobCleanup.testCustomCleanup

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
	at org.apache.hadoop.mapred.TestJobCleanup.testKilledJob(TestJobCleanup.java:233)
	at org.apache.hadoop.mapred.TestJobCleanup.testCustomCleanup(TestJobCleanup.java:321)


FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestSpeculativeExecution.testSpeculativeExecution

Error Message:
Job status not available 

Stack Trace:
java.io.IOException: Job status not available 
	at org.apache.hadoop.mapreduce.Job.updateStatus(Job.java:331)
	at org.apache.hadoop.mapreduce.Job.isComplete(Job.java:604)
	at org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1400)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1362)
	at org.apache.hadoop.mapreduce.v2.TestSpeculativeExecution.testSpeculativeExecution(TestSpeculativeExecution.java:249)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
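
Among the failures above, TestMRCJCFileOutputCommitter.testAbort reports "Output directory not empty expected:<0> but was:<4>", i.e. four files were left behind after the job was aborted. A hypothetical sketch of that kind of check follows; the path is a placeholder and this is not the code at TestMRCJCFileOutputCommitter.java:153.

    import static org.junit.Assert.assertEquals;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class EmptyOutputDirCheckSketch {
      // After an aborted job, the output directory should contain no files.
      static void assertOutputDirEmpty(Configuration conf, Path outDir) throws Exception {
        FileSystem fs = outDir.getFileSystem(conf);
        int length = fs.listStatus(outDir).length;
        assertEquals("Output directory not empty", 0, length);
      }
    }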



Hadoop-Mapreduce-trunk - Build # 3247 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3247/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31594 lines...]
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.157 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices
Running org.apache.hadoop.mapreduce.v2.app.TestTaskHeartbeatHandler
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.116 sec - in org.apache.hadoop.mapreduce.v2.app.TestTaskHeartbeatHandler
Running org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.215 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.493 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.253 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.937 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.754 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:22 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 34.523 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.781 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:53 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14:03 min
[INFO] Finished at: 2016-04-26T13:26:15+00:00
[INFO] Final Memory: 43M/1247M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)



Hadoop-Mapreduce-trunk - Build # 3246 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3246/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32134 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.854 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.501 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.587 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.105 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.848 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.235 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.208 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:26 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:07 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:51 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-04-26T10:42:18+00:00
[INFO] Final Memory: 35M/734M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
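All four TestUberAM failures above report only "java.lang.AssertionError: null". That is what JUnit's message-less assertTrue(boolean) produces: the Assert.java:52 frame is the one-argument overload, which delegates to the two-argument form with a null message, so the report cannot say which counter check at TestMRJobs.java:474 was violated. A minimal sketch of the difference is below (illustrative only; the counter name and values are hypothetical, not taken from the Hadoop tests):

    // Illustrative sketch, not the Hadoop test source: why the failures above
    // surface only as "java.lang.AssertionError: null".
    import static org.junit.Assert.assertTrue;

    public class CounterAssertionSketch {

        public static void main(String[] args) {
            long launchedMaps = 0;   // hypothetical counter value read from a finished job
            long expectedMaps = 3;   // hypothetical expected value

            // The message-less form (as indicated by the Assert.java:52 frame above)
            // reports a failure only as "AssertionError: null":
            // assertTrue(launchedMaps == expectedMaps);

            // Adding a message makes the same failure self-describing.
            // The counter name here is illustrative only.
            assertTrue("unexpected launched-maps counter: " + launchedMaps,
                       launchedMaps == expectedMaps);
        }
    }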



Hadoop-Mapreduce-trunk - Build # 3245 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3245/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32133 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.875 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.228 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.569 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.024 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.879 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.210 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.150 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:28 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:38 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-04-26T08:05:49+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3244 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3244/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32142 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.815 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.662 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.256 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.367 sec - in org.apache.hadoop.hdfs.TestNNBench

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.896 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.164 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.956 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:22 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:43 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:55 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:12 h
[INFO] Finished at: 2016-04-26T02:07:05+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
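The extra failure in this build, TestMRCJCFileOutputCommitter.testAbort, uses the message-bearing assertEquals, so its report is self-describing: four entries were left in an output directory that should have been empty after the abort. A rough sketch of a check with that shape is below (assumptions: JUnit 4 and hadoop-common on the classpath; the directory path is hypothetical and this is not the actual test source):

    // Illustrative sketch only (not TestMRCJCFileOutputCommitter itself): the
    // shape of a check that yields "Output directory not empty expected:<0> but was:<4>".
    import static org.junit.Assert.assertEquals;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class AbortCheckSketch {

        public static void main(String[] args) throws Exception {
            // Hypothetical local output directory left over after an aborted task.
            Path outDir = new Path("build/test/output");

            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.getLocal(conf);

            // Count whatever the abort left behind; an empty directory is expected.
            int leftover = fs.exists(outDir) ? fs.listStatus(outDir).length : 0;

            // The message prefix plus expected/actual rendering matches the report above.
            assertEquals("Output directory not empty", 0, leftover);
        }
    }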



Hadoop-Mapreduce-trunk - Build # 3243 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3243/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32143 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.474 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.668 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.254 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.263 sec - in org.apache.hadoop.hdfs.TestNNBench

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.964 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:55 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.690 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.871 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:27 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:42 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:45 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-25T23:26:46+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3242 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3242/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32142 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.043 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.331 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.58 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.876 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.960 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.483 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.143 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:20 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-04-25T20:23:49+00:00
[INFO] Final Memory: 33M/597M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3241 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3241/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32133 lines...]
Running org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.087 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.88 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.319 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.606 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.904 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.775 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.077 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:16 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:32 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:08 h
[INFO] Finished at: 2016-04-25T17:05:40+00:00
[INFO] Final Memory: 34M/603M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
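The resume hint in the console output above leaves the goals placeholder to the reader. Two typical invocations are sketched below; the goal and module selection are illustrative, since this excerpt does not show which goals the Jenkins job actually ran:

    # Resume the reactor from the module that failed (the "test" goal is an example):
    mvn test -rf :hadoop-mapreduce-client-jobclient

    # Re-run a single failing class through Surefire while investigating:
    mvn test -pl :hadoop-mapreduce-client-jobclient -Dtest=TestUberAM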



Hadoop-Mapreduce-trunk - Build # 3240 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3240/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31594 lines...]
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.875 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.832 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.979 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.396 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.603 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.104 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.016 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.411 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.239 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [09:24 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11:53 min
[INFO] Finished at: 2016-04-25T07:09:28+00:00
[INFO] Final Memory: 34M/701M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.

FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)



Hadoop-Mapreduce-trunk - Build # 3239 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3239/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32035 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.862 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.56 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.596 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.132 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.875 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.320 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.168 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:28 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:38 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2016-04-23T18:02:57+00:00
[INFO] Final Memory: 33M/603M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3238 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3238/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32084 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.884 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.625 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.624 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.135 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests in error: 
  TestMapReduceLazyOutput.testLazyOutput:186 » NoClassDefFound org/apache/hadoop...

Tests run: 532, Failures: 4, Errors: 1, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.168 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.726 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.190 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:16 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:00 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:17 h
[INFO] Finished at: 2016-04-23T09:15:18+00:00
[INFO] Final Memory: 39M/767M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapreduce.TestMapReduceLazyOutput.testLazyOutput

Error Message:
org/apache/hadoop/service/ServiceOperations

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/service/ServiceOperations
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.stop(MiniMRYarnClusterAdapter.java:55)
	at org.apache.hadoop.mapred.MiniMRCluster.shutdown(MiniMRCluster.java:267)
	at org.apache.hadoop.mapreduce.TestMapReduceLazyOutput.testLazyOutput(TestMapReduceLazyOutput.java:186)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
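The NoClassDefFoundError above is raised only at teardown: per the stack frames, org.apache.hadoop.service.ServiceOperations is being loaded from CompositeService.stop() while MiniMRCluster shuts down, so the classpath problem surfaces after the job itself has already run. A small diagnostic sketch that probes whether the class is resolvable on the current classpath (illustrative only, not part of the Hadoop tests):

    // Small diagnostic sketch: checks up front whether the class that failed to
    // load in the report above is resolvable, and from which jar.
    public class ClasspathProbe {

        public static void main(String[] args) {
            // Class name copied from the NoClassDefFoundError in the report above.
            String name = "org.apache.hadoop.service.ServiceOperations";
            try {
                Class<?> c = Class.forName(name);
                java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
                System.out.println(name + " resolved"
                        + (src != null ? " from " + src.getLocation() : ""));
            } catch (ClassNotFoundException | NoClassDefFoundError e) {
                System.out.println(name + " is NOT resolvable on this classpath: " + e);
            }
        }
    }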



Hadoop-Mapreduce-trunk - Build # 3237 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3237/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31214 lines...]
Running org.apache.hadoop.mapreduce.tools.TestCLI
Tests run: 6, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 1.439 sec <<< FAILURE! - in org.apache.hadoop.mapreduce.tools.TestCLI
testGetJob(org.apache.hadoop.mapreduce.tools.TestCLI)  Time elapsed: 0.066 sec  <<< FAILURE!
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)

Running org.apache.hadoop.mapreduce.TestTaskID
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.264 sec - in org.apache.hadoop.mapreduce.TestTaskID

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.076 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:55 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:00 min
[INFO] Finished at: 2016-04-23T01:50:28+00:00
[INFO] Final Memory: 42M/1292M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk - Build # 3236 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3236/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 37803 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.306 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.55 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.251 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests in error: 
  TestUberAM>TestMRJobs.testConfVerificationWithClassloader:310->TestMRJobs.testConfVerification:414 » 
  TestUberAM>TestMRJobs.tearDown:177 » NoClassDefFound org/apache/hadoop/service...

Tests run: 530, Failures: 6, Errors: 2, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.870 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.304 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.180 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:18 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:38 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:58 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:15 h
[INFO] Finished at: 2016-04-23T00:51:19+00:00
[INFO] Final Memory: 34M/713M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
8 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
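
The "Output directory not empty expected:<0> but was:<4>" message is JUnit's
assertEquals format: a caller-supplied prefix plus the expected and actual
values, here the number of entries still left in the committer's output
directory after the abort path ran. A rough sketch of that style of check,
with assumed paths and names rather than the real TestMRCJCFileOutputCommitter
code:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.junit.Assert;

    public class OutputDirCheckSketch {
      public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.getLocal(new Configuration());
        Path outDir = new Path("/tmp/committer-abort-out");     // assumed location
        FileStatus[] remaining =
            fs.exists(outDir) ? fs.listStatus(outDir) : new FileStatus[0];
        // On failure this prints exactly the shape seen above:
        //   Output directory not empty expected:<0> but was:<N>
        Assert.assertEquals("Output directory not empty", 0, remaining.length);
      }
    }

A non-zero count means files from the attempt survived the abortTask/abortJob
cleanup, which is what the assertion guards against.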


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testConfVerificationWithClassloader

Error Message:
test timed out after 300000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 300000 milliseconds
	at java.lang.Thread.sleep(Native Method)
	at org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1404)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1362)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testConfVerification(TestMRJobs.java:414)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testConfVerificationWithClassloader(TestMRJobs.java:310)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.org.apache.hadoop.mapreduce.v2.TestUberAM

Error Message:
org/apache/hadoop/service/ServiceOperations

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/service/ServiceOperations
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.tearDown(TestMRJobs.java:177)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.service.ServiceOperations
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.tearDown(TestMRJobs.java:177)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)



Hadoop-Mapreduce-trunk - Build # 3235 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3235/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31610 lines...]
Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.992 sec - in org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobs
Running org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.906 sec - in org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices
Running org.apache.hadoop.mapreduce.v2.hs.TestJobHistory
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.002 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJobHistory
Running org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEvents
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 43.134 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEvents
Running org.apache.hadoop.mapreduce.v2.hs.TestHistoryFileManager
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.223 sec - in org.apache.hadoop.mapreduce.v2.hs.TestHistoryFileManager
Running org.apache.hadoop.mapreduce.v2.hs.TestJHSDelegationTokenSecretManager
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.293 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJHSDelegationTokenSecretManager

Results :

Tests in error: 
  TestJobListCache.testAddExisting:39 »  test timed out after 1000 milliseconds

Tests run: 209, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.339 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:13 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 45.632 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  8.879 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [12:52 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. FAILURE [07:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 24:45 min
[INFO] Finished at: 2016-04-22T21:43:22+00:00
[INFO] Final Memory: 37M/860M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-hs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-hs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobListCache.testAddExisting

Error Message:
test timed out after 1000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 1000 milliseconds
	at java.lang.ClassLoader.findLoadedClass0(Native Method)
	at java.lang.ClassLoader.findLoadedClass(ClassLoader.java:1093)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:407)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:412)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.mockito.cglib.core.EmitUtils.append_string(EmitUtils.java:569)
	at org.mockito.cglib.core.KeyFactory$Generator.generateClass(KeyFactory.java:248)
	at org.mockito.cglib.core.DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:25)
	at org.mockito.cglib.core.AbstractClassGenerator.create(AbstractClassGenerator.java:217)
	at org.mockito.cglib.core.KeyFactory$Generator.create(KeyFactory.java:145)
	at org.mockito.cglib.core.KeyFactory.create(KeyFactory.java:117)
	at org.mockito.cglib.core.KeyFactory.create(KeyFactory.java:109)
	at org.mockito.cglib.core.KeyFactory.create(KeyFactory.java:105)
	at org.mockito.cglib.proxy.Enhancer.<clinit>(Enhancer.java:70)
	at org.mockito.internal.creation.jmock.ClassImposterizer.createProxyClass(ClassImposterizer.java:68)
	at org.mockito.internal.creation.jmock.ClassImposterizer.imposterise(ClassImposterizer.java:50)
	at org.mockito.internal.util.MockUtil.createMock(MockUtil.java:54)
	at org.mockito.internal.MockitoCore.mock(MockitoCore.java:45)
	at org.mockito.Mockito.mock(Mockito.java:921)
	at org.mockito.Mockito.mock(Mockito.java:816)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobListCache.testAddExisting(TestJobListCache.java:39)
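
The 1000 ms here is a JUnit 4 @Test(timeout = ...) budget, and the stack trace
shows where the time went: the first Mockito.mock() call pays for mockito/cglib
proxy-class generation (Enhancer, KeyFactory), which on a busy executor can by
itself exceed a 1-second limit before the cache is even exercised. A minimal
sketch of the pattern with a stand-in type and an assumed, more forgiving
budget -- not the real TestJobListCache code, and not necessarily the right fix:

    import static org.mockito.Mockito.mock;

    import org.junit.Test;

    public class TimeoutSketch {
      static class CompletedJob { }                  // stand-in for the mocked type

      @Test(timeout = 5000)                          // assumed larger budget
      public void addExisting() {
        CompletedJob job = mock(CompletedJob.class); // first mock triggers class generation
        // ... add the mocked job to the cache twice and assert on the result ...
      }
    }

Creating the first mock in a @BeforeClass method would also move the one-time
class-generation cost outside the timed test body.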



Hadoop-Mapreduce-trunk - Build # 3234 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3234/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32044 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.043 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.426 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.646 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.324 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.872 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.508 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.135 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:17 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:00 h
[INFO] Finished at: 2016-04-22T19:57:38+00:00
[INFO] Final Memory: 34M/611M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3233 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3233/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31532 lines...]
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.659 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.11 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.707 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.599 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.841 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.46 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Tests in error: 
  TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart:2535 » IllegalState

Tests run: 344, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.599 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:08 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 31.764 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.672 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:08 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:00 min
[INFO] Finished at: 2016-04-22T17:12:16+00:00
[INFO] Final Memory: 36M/835M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart

Error Message:
InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.

Stack Trace:
java.lang.IllegalStateException: InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:739)
	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1828)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1792)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1892)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1887)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.parseFrom(SecurityProtos.java:2100)
	at org.apache.hadoop.security.Credentials.readProtos(Credentials.java:331)
	at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:226)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.convertCredentialsFromByteBuffer(ApplicationAttemptStateDataPBImpl.java:372)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.getAppAttemptTokens(ApplicationAttemptStateDataPBImpl.java:152)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.attempt.RMAppAttemptImpl.recover(RMAppAttemptImpl.java:914)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.recover(RMAppImpl.java:858)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:998)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:991)
	at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.handle(RMAppImpl.java:816)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recoverApplication(RMAppManager.java:331)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recover(RMAppManager.java:477)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.recover(ResourceManager.java:1310)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:665)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1097)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1137)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1133)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1133)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1173)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator$MyResourceManager.serviceStart(TestRMContainerAllocator.java:855)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart(TestRMContainerAllocator.java:2535)
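
The IllegalStateException above comes from protobuf's CodedInputStream, which
treats InputStream#read(byte[]) returning 0 for a non-empty buffer as a broken
stream (the contract is: block until at least one byte is read, return -1 at
end of stream, or throw). The trace shows it firing while the restarted
ResourceManager re-reads persisted application-attempt credentials during
recovery. A self-contained sketch that reproduces the same message with a
deliberately misbehaving stand-in stream -- not the Hadoop code path itself:

    import com.google.protobuf.CodedInputStream;

    import java.io.InputStream;

    public class ZeroReadSketch {
      public static void main(String[] args) throws Exception {
        InputStream zeroReturning = new InputStream() {
          @Override public int read() { return -1; }
          @Override public int read(byte[] b, int off, int len) {
            return 0;                                // violates the read() contract
          }
        };
        // Throws: java.lang.IllegalStateException:
        //   InputStream#read(byte[]) returned invalid result: 0
        CodedInputStream.newInstance(zeroReturning).readTag();
      }
    }

In the real failure the stream wraps the credentials ByteBuffer recovered from
the RM state store, so a zero-length read there would point at the
buffer-to-stream adapter rather than at protobuf.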



Hadoop-Mapreduce-trunk - Build # 3232 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3232/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31532 lines...]
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.809 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.644 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.904 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.363 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.585 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.11 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Tests in error: 
  TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart:2535 » IllegalState

Tests run: 344, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.916 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.125 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.171 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [09:15 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11:43 min
[INFO] Finished at: 2016-04-22T11:09:31+00:00
[INFO] Final Memory: 34M/676M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart

Error Message:
InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.

Stack Trace:
java.lang.IllegalStateException: InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:739)
	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1828)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1792)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1892)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1887)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.parseFrom(SecurityProtos.java:2100)
	at org.apache.hadoop.security.Credentials.readProtos(Credentials.java:331)
	at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:226)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.convertCredentialsFromByteBuffer(ApplicationAttemptStateDataPBImpl.java:372)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.getAppAttemptTokens(ApplicationAttemptStateDataPBImpl.java:152)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.attempt.RMAppAttemptImpl.recover(RMAppAttemptImpl.java:914)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.recover(RMAppImpl.java:858)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:998)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:991)
	at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.handle(RMAppImpl.java:816)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recoverApplication(RMAppManager.java:331)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recover(RMAppManager.java:477)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.recover(ResourceManager.java:1310)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:665)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1097)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1137)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1133)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1133)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1173)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator$MyResourceManager.serviceStart(TestRMContainerAllocator.java:855)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart(TestRMContainerAllocator.java:2535)



Hadoop-Mapreduce-trunk - Build # 3231 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3231/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31534 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.526 sec - in org.apache.hadoop.mapreduce.v2.app.launcher.TestContainerLauncher
Running org.apache.hadoop.mapreduce.v2.app.TestFail
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.597 sec - in org.apache.hadoop.mapreduce.v2.app.TestFail
Running org.apache.hadoop.mapreduce.v2.app.TestMRApp
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.262 sec - in org.apache.hadoop.mapreduce.v2.app.TestMRApp
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.636 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.37 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.105 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Tests in error: 
  TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart:2535 » IllegalState

Tests run: 344, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.832 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.598 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.209 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [09:19 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11:49 min
[INFO] Finished at: 2016-04-22T04:17:28+00:00
[INFO] Final Memory: 35M/880M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart

Error Message:
InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.

Stack Trace:
java.lang.IllegalStateException: InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:739)
	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1828)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1792)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1892)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1887)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.parseFrom(SecurityProtos.java:2100)
	at org.apache.hadoop.security.Credentials.readProtos(Credentials.java:331)
	at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:226)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.convertCredentialsFromByteBuffer(ApplicationAttemptStateDataPBImpl.java:372)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.getAppAttemptTokens(ApplicationAttemptStateDataPBImpl.java:152)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.attempt.RMAppAttemptImpl.recover(RMAppAttemptImpl.java:914)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.recover(RMAppImpl.java:858)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:998)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:991)
	at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.handle(RMAppImpl.java:816)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recoverApplication(RMAppManager.java:331)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recover(RMAppManager.java:477)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.recover(ResourceManager.java:1310)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:665)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1097)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1137)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1133)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1133)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1173)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator$MyResourceManager.serviceStart(TestRMContainerAllocator.java:855)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart(TestRMContainerAllocator.java:2535)



Hadoop-Mapreduce-trunk - Build # 3230 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3230/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31532 lines...]
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.506 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices
Running org.apache.hadoop.mapreduce.v2.app.TestTaskHeartbeatHandler
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.38 sec - in org.apache.hadoop.mapreduce.v2.app.TestTaskHeartbeatHandler
Running org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.629 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.603 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.416 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.107 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Tests in error: 
  TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart:2535 » IllegalState

Tests run: 344, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.252 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:12 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 46.585 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  9.507 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [12:50 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17:06 min
[INFO] Finished at: 2016-04-22T01:23:33+00:00
[INFO] Final Memory: 35M/892M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart

Error Message:
InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.

Stack Trace:
java.lang.IllegalStateException: InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:739)
	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1828)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1792)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1892)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1887)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.parseFrom(SecurityProtos.java:2100)
	at org.apache.hadoop.security.Credentials.readProtos(Credentials.java:331)
	at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:226)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.convertCredentialsFromByteBuffer(ApplicationAttemptStateDataPBImpl.java:372)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.getAppAttemptTokens(ApplicationAttemptStateDataPBImpl.java:152)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.attempt.RMAppAttemptImpl.recover(RMAppAttemptImpl.java:914)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.recover(RMAppImpl.java:858)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:998)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:991)
	at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.handle(RMAppImpl.java:816)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recoverApplication(RMAppManager.java:331)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recover(RMAppManager.java:477)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.recover(ResourceManager.java:1310)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:665)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1097)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1137)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1133)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1133)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1173)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator$MyResourceManager.serviceStart(TestRMContainerAllocator.java:855)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart(TestRMContainerAllocator.java:2535)
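
The IllegalStateException above is raised by protobuf's CodedInputStream.refillBuffer() when the underlying stream breaks the java.io.InputStream contract: read(byte[], off, len) must return the number of bytes read (at least 1 for a non-empty buffer) or -1 at end of stream, never 0. The following is a minimal illustrative sketch of that contract, not Hadoop or protobuf code; the class names (ZeroReadExample, BuggyStream, checkedRead) are made up for the example.

import java.io.IOException;
import java.io.InputStream;

public class ZeroReadExample {

  // Hypothetical stream that violates the contract by returning 0.
  static class BuggyStream extends InputStream {
    @Override
    public int read() {
      return -1;
    }

    @Override
    public int read(byte[] b, int off, int len) {
      // Contract violation: 0 may only be returned when len == 0.
      return 0;
    }
  }

  // Roughly what the protobuf parser's sanity check amounts to.
  static int checkedRead(InputStream in, byte[] buf) throws IOException {
    int n = in.read(buf, 0, buf.length);
    if (n == 0 && buf.length > 0) {
      throw new IllegalStateException(
          "InputStream#read(byte[]) returned invalid result: 0\n"
              + "The InputStream implementation is buggy.");
    }
    return n;
  }

  public static void main(String[] args) throws IOException {
    // Reproduces the failure mode reported in the stack trace above.
    checkedRead(new BuggyStream(), new byte[16]);
  }
}

In the failing test, the stream being parsed is the ByteBuffer-backed credentials blob read during RM recovery; a wrapper that returns 0 instead of -1 at end of data would produce exactly this error.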



Hadoop-Mapreduce-trunk - Build # 3229 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3229/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31212 lines...]
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.208 sec - in org.apache.hadoop.mapreduce.task.reduce.TestMerger
Running org.apache.hadoop.mapreduce.TestJobMonitorAndPrint
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.524 sec - in org.apache.hadoop.mapreduce.TestJobMonitorAndPrint
Running org.apache.hadoop.mapreduce.filecache.TestClientDistributedCacheManager
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.732 sec - in org.apache.hadoop.mapreduce.filecache.TestClientDistributedCacheManager
Running org.apache.hadoop.mapreduce.TestShufflePlugin
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.588 sec - in org.apache.hadoop.mapreduce.TestShufflePlugin
Running org.apache.hadoop.mapreduce.TestContextFactory
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.481 sec - in org.apache.hadoop.mapreduce.TestContextFactory
Running org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.893 sec - in org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.119 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [02:00 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 min
[INFO] Finished at: 2016-04-21T22:51:52+00:00
[INFO] Final Memory: 35M/913M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
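
The "Error Message: null" above is a consequence of how JUnit 4 reports a failed one-argument assertTrue(boolean): the thrown AssertionError carries no message, so the report prints null. A small illustrative sketch follows; it is not the actual TestCLI code, and the class and method names are invented for the example.

import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class AssertMessageExample {

  @Test
  public void failsWithNullMessage() {
    // Mirrors the style of the failing assertion at TestCLI.java:181;
    // the resulting AssertionError has a null message.
    assertTrue(1 + 1 == 3);
  }

  @Test
  public void failsWithReadableMessage() {
    // Supplying the message argument makes the same failure self-describing.
    assertTrue("expected job lookup to succeed", 1 + 1 == 3);
  }
}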



Hadoop-Mapreduce-trunk - Build # 3228 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3228/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31532 lines...]
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.249 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices
Running org.apache.hadoop.mapreduce.v2.app.TestTaskHeartbeatHandler
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.483 sec - in org.apache.hadoop.mapreduce.v2.app.TestTaskHeartbeatHandler
Running org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.914 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.815 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.992 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.222 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Tests in error: 
  TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart:2535 » IllegalState

Tests run: 344, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.418 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:18 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 47.956 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  9.860 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [13:05 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17:28 min
[INFO] Finished at: 2016-04-21T21:56:04+00:00
[INFO] Final Memory: 43M/1246M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart

Error Message:
InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.

Stack Trace:
java.lang.IllegalStateException: InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:739)
	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1828)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1792)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1892)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1887)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.parseFrom(SecurityProtos.java:2100)
	at org.apache.hadoop.security.Credentials.readProtos(Credentials.java:331)
	at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:226)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.convertCredentialsFromByteBuffer(ApplicationAttemptStateDataPBImpl.java:372)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.getAppAttemptTokens(ApplicationAttemptStateDataPBImpl.java:152)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.attempt.RMAppAttemptImpl.recover(RMAppAttemptImpl.java:914)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.recover(RMAppImpl.java:858)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:998)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:991)
	at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.handle(RMAppImpl.java:816)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recoverApplication(RMAppManager.java:331)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recover(RMAppManager.java:477)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.recover(ResourceManager.java:1310)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:665)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1097)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1137)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1133)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1133)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1173)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator$MyResourceManager.serviceStart(TestRMContainerAllocator.java:855)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart(TestRMContainerAllocator.java:2535)



Hadoop-Mapreduce-trunk - Build # 3227 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3227/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31213 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.581 sec - in org.apache.hadoop.mapreduce.lib.output.TestFileOutputFormat
Running org.apache.hadoop.mapreduce.lib.input.TestFileInputFormat
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.503 sec - in org.apache.hadoop.mapreduce.lib.input.TestFileInputFormat
Running org.apache.hadoop.mapreduce.lib.input.TestLineRecordReader
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.237 sec - in org.apache.hadoop.mapreduce.lib.input.TestLineRecordReader
Running org.apache.hadoop.mapreduce.lib.input.TestCombineFileRecordReader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.659 sec - in org.apache.hadoop.mapreduce.lib.input.TestCombineFileRecordReader
Running org.apache.hadoop.mapreduce.lib.partition.TestRehashPartitioner
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.215 sec - in org.apache.hadoop.mapreduce.lib.partition.TestRehashPartitioner
Running org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.902 sec - in org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.861 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:56 min
[INFO] Finished at: 2016-04-21T20:25:43+00:00
[INFO] Final Memory: 30M/718M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk - Build # 3226 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3226/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32332 lines...]
Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests in error: 
  TestJobSysDirWithDFS.testWithDFS:132 » YarnRuntime org.apache.hadoop.yarn.weba...
  TestNetworkedJob.testGetJobStatus:77->createMiniClusterWithCapacityScheduler:384 » NoClassDefFound
  TestNetworkedJob.testJobQueueClient:301->createMiniClusterWithCapacityScheduler:384 » NoClassDefFound
  TestNetworkedJob.testGetNullCounters:61 » NoClassDefFound org/apache/hadoop/ya...
  TestNetworkedJob.testNetworkedJob:129->createMiniClusterWithCapacityScheduler:384 » NoClassDefFound
  TestReduceFetch>TestReduceFetchFromPartialMem.setUp:53 » NoClassDefFound org/a...
  TestReduceFetch>TestReduceFetchFromPartialMem.setUp:53 » NoClassDefFound org/a...
  TestReduceFetch>TestReduceFetchFromPartialMem.setUp:53 » NoClassDefFound org/a...

Tests run: 520, Failures: 4, Errors: 8, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.899 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.501 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.026 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:10 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:34 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:59 h
[INFO] Finished at: 2016-04-21T19:56:26+00:00
[INFO] Final Memory: 33M/603M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.7.0_55/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter1043616284853371470.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire5173779952329598017tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2022132049302902058090tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
12 tests failed.
FAILED:  org.apache.hadoop.mapred.TestJobSysDirWithDFS.testWithDFS

Error Message:
org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:870)
	at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:348)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1078)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1176)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:335)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$300(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:464)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:128)
	at org.apache.hadoop.mapred.TestJobSysDirWithDFS.testWithDFS(TestJobSysDirWithDFS.java:132)


FAILED:  org.apache.hadoop.mapred.TestNetworkedJob.testGetJobStatus

Error Message:
org/apache/hadoop/yarn/conf/YarnConfiguration

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/conf/YarnConfiguration
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
	at org.apache.hadoop.mapred.TestNetworkedJob.createMiniClusterWithCapacityScheduler(TestNetworkedJob.java:384)
	at org.apache.hadoop.mapred.TestNetworkedJob.testGetJobStatus(TestNetworkedJob.java:77)


FAILED:  org.apache.hadoop.mapred.TestNetworkedJob.testJobQueueClient

Error Message:
org/apache/hadoop/yarn/conf/YarnConfiguration

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/conf/YarnConfiguration
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
	at org.apache.hadoop.mapred.TestNetworkedJob.createMiniClusterWithCapacityScheduler(TestNetworkedJob.java:384)
	at org.apache.hadoop.mapred.TestNetworkedJob.testJobQueueClient(TestNetworkedJob.java:301)


FAILED:  org.apache.hadoop.mapred.TestNetworkedJob.testGetNullCounters

Error Message:
org/apache/hadoop/yarn/api/records/ReservationId

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/api/records/ReservationId
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
	at java.lang.Class.getDeclaredMethods(Class.java:1855)
	at org.mockito.cglib.core.ReflectUtils.addAllMethods(ReflectUtils.java:349)
	at org.mockito.cglib.proxy.Enhancer.getMethods(Enhancer.java:422)
	at org.mockito.cglib.proxy.Enhancer.generateClass(Enhancer.java:457)
	at org.mockito.cglib.core.DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:25)
	at org.mockito.cglib.core.AbstractClassGenerator.create(AbstractClassGenerator.java:217)
	at org.mockito.cglib.proxy.Enhancer.createHelper(Enhancer.java:378)
	at org.mockito.cglib.proxy.Enhancer.createClass(Enhancer.java:318)
	at org.mockito.internal.creation.jmock.ClassImposterizer.createProxyClass(ClassImposterizer.java:93)
	at org.mockito.internal.creation.jmock.ClassImposterizer.imposterise(ClassImposterizer.java:50)
	at org.mockito.internal.util.MockUtil.createMock(MockUtil.java:54)
	at org.mockito.internal.MockitoCore.mock(MockitoCore.java:45)
	at org.mockito.Mockito.mock(Mockito.java:921)
	at org.mockito.Mockito.mock(Mockito.java:816)
	at org.apache.hadoop.mapred.TestNetworkedJob.testGetNullCounters(TestNetworkedJob.java:61)


FAILED:  org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob

Error Message:
org/apache/hadoop/yarn/conf/YarnConfiguration

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/conf/YarnConfiguration
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
	at org.apache.hadoop.mapred.TestNetworkedJob.createMiniClusterWithCapacityScheduler(TestNetworkedJob.java:384)
	at org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob(TestNetworkedJob.java:129)


FAILED:  org.apache.hadoop.mapred.TestReduceFetch.testReduceFromMem

Error Message:
org/apache/hadoop/yarn/server/MiniYARNCluster

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/server/MiniYARNCluster
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.TestReduceFetchFromPartialMem.setUp(TestReduceFetchFromPartialMem.java:53)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.server.MiniYARNCluster
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.TestReduceFetchFromPartialMem.setUp(TestReduceFetchFromPartialMem.java:53)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapred.TestReduceFetch.testReduceFromDisk

Error Message:
org/apache/hadoop/mapreduce/v2/MiniMRYarnCluster

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/MiniMRYarnCluster
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.TestReduceFetchFromPartialMem.setUp(TestReduceFetchFromPartialMem.java:53)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapred.TestReduceFetch.testReduceFromPartialMem

Error Message:
org/apache/hadoop/mapreduce/v2/MiniMRYarnCluster

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/MiniMRYarnCluster
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.TestReduceFetchFromPartialMem.setUp(TestReduceFetchFromPartialMem.java:53)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
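
The NoClassDefFoundError entries in this run (YarnConfiguration, ReservationId, MiniYARNCluster) indicate classes that were available at compile time but could not be loaded by the forked surefire JVM at runtime, which fits the forked-VM crash reported by Maven above. The sketch below is a generic classpath probe, not Hadoop code; the class name ClasspathProbeExample and the probe method are invented, while the probed class names are taken from the error messages above.

public class ClasspathProbeExample {

  public static void main(String[] args) {
    probe("org.apache.hadoop.yarn.conf.YarnConfiguration");
    probe("org.apache.hadoop.yarn.server.MiniYARNCluster");
  }

  static void probe(String className) {
    try {
      Class.forName(className);
      System.out.println("present: " + className);
    } catch (ClassNotFoundException e) {
      // A direct Class.forName() miss surfaces as ClassNotFoundException;
      // code compiled against the class and linked lazily at runtime fails
      // with NoClassDefFoundError instead, as in the stack traces above.
      System.out.println("missing: " + className);
    }
  }
}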



Hadoop-Mapreduce-trunk - Build # 3225 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3225/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32035 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.847 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 76.646 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.499 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.268 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.889 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.215 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.112 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:28 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2016-04-21T15:03:10+00:00
[INFO] Final Memory: 34M/611M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3224 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3224/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32044 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.068 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.673 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.255 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.199 sec - in org.apache.hadoop.hdfs.TestNNBench

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.918 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:53 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.755 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.053 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:24 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:46 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-21T10:26:52+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
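
The "Output directory not empty" failure is a plain assertEquals on how many files remain after the abort path runs: four files survived where the test expects zero, which usually points at leftover output from a previous attempt or an abort that did not clean up. A hedged sketch of that kind of check follows; the paths and helper class are illustrative, not the actual committer test.

    import static org.junit.Assert.assertEquals;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Illustrative only: counts what is left in the output directory after an
    // aborted job; <0> and <4> in the report above are the expected and actual
    // lengths of such a listing.
    class AbortCleanupCheckSketch {
      static void assertOutputDirEmpty(Configuration conf, Path outDir) throws Exception {
        FileSystem fs = outDir.getFileSystem(conf);
        int leftover = fs.listStatus(outDir).length;
        assertEquals("Output directory not empty", 0, leftover);
      }
    }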


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3223 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3223/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31545 lines...]
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.237 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices
Running org.apache.hadoop.mapreduce.v2.app.TestTaskHeartbeatHandler
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.607 sec - in org.apache.hadoop.mapreduce.v2.app.TestTaskHeartbeatHandler
Running org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.401 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.65 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.653 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.165 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Tests in error: 
  TestContainerLauncher.testSlowNM:258 NoClassDefFound org/apache/hadoop/net/Net...

Tests run: 305, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.285 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:14 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 49.195 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  9.840 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [12:06 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16:27 min
[INFO] Finished at: 2016-04-21T07:37:40+00:00
[INFO] Final Memory: 34M/891M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app && /home/jenkins/tools/java/jdk1.7.0_55/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefirebooter7129851810923234997.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefire3297189750946946817tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefire_918129327137013895370tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.launcher.TestContainerLauncher.testSlowNM

Error Message:
org/apache/hadoop/net/NetUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/net/NetUtils
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapreduce.v2.app.launcher.TestContainerLauncher.testSlowNM(TestContainerLauncher.java:258)
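
Unlike the assertion failures elsewhere in this thread, this one is a classpath problem: TestContainerLauncher was compiled against org.apache.hadoop.net.NetUtils, but the forked test VM could not load that class at its first point of use (line 258), which typically means hadoop-common was missing or incomplete on the test classpath. A small sketch of a pre-flight guard that would surface the same condition as a skipped test instead of an error; the guard class is a hypothetical illustration, not part of the Hadoop test suite.

    import org.junit.Assume;
    import org.junit.Before;
    import org.junit.Test;

    // Illustrative guard: skip rather than error out when a required class is
    // absent from the test classpath.
    public class NetUtilsClasspathGuardSketch {
      @Before
      public void requireNetUtils() {
        try {
          Class.forName("org.apache.hadoop.net.NetUtils");
        } catch (ClassNotFoundException missing) {
          // Turns the missing dependency into an assumption failure (skipped test).
          Assume.assumeNoException(missing);
        }
      }

      @Test
      public void testSlowNM() {
        // ... would exercise the container launcher here ...
      }
    }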



Hadoop-Mapreduce-trunk - Build # 3222 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3222/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32041 lines...]
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.002 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.76 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.337 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.665 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testMapRedExecutionEnv:452 Exception in testing propagation of env setting to child task
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.050 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:07 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.060 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.067 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:29 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:39 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-21T04:52:16+00:00
[INFO] Final Memory: 34M/615M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testMapRedExecutionEnv

Error Message:
Exception in testing propagation of env setting to child task

Stack Trace:
java.lang.AssertionError: Exception in testing propagation of env setting to child task
	at org.junit.Assert.fail(Assert.java:88)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testMapRedExecutionEnv(TestMiniMRChildTask.java:452)
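
The stack here is a bare Assert.fail(String) at TestMiniMRChildTask.java:452, so whatever exception actually broke the env-propagation check is swallowed; only the generic message survives in the report. A hedged sketch of that catch-and-fail pattern, plus a variant that keeps the root cause in the surefire log; the helper methods are illustrative, not the actual test code.

    import static org.junit.Assert.fail;

    // Illustrative only: the first variant reproduces the opaque report seen
    // above; the second preserves the original exception as the cause.
    class EnvPropagationFailureSketch {
      void runJobSwallowingCause(Runnable submitJob) {
        try {
          submitJob.run();
        } catch (Exception e) {
          fail("Exception in testing propagation of env setting to child task"); // cause lost
        }
      }

      void runJobKeepingCause(Runnable submitJob) {
        try {
          submitJob.run();
        } catch (Exception e) {
          throw new AssertionError("env propagation to child task failed", e); // cause kept
        }
      }
    }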


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3221 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3221/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32036 lines...]
Running org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 68.698 sec - in org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService
Running org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 59.865 sec - in org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Running org.apache.hadoop.mapreduce.TestValueIterReset
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.007 sec - in org.apache.hadoop.mapreduce.TestValueIterReset
Running org.apache.hadoop.mapreduce.TestMapReduceLazyOutput
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 159.98 sec - in org.apache.hadoop.mapreduce.TestMapReduceLazyOutput

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.666 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:03 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 31.761 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.542 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:39 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:04 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:55 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:15 h
[INFO] Finished at: 2016-04-21T01:24:50+00:00
[INFO] Final Memory: 34M/724M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3220 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3220/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31610 lines...]
Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.12 sec - in org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobs
Running org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.931 sec - in org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices
Running org.apache.hadoop.mapreduce.v2.hs.TestJobHistory
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.165 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJobHistory
Running org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEvents
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 43.573 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEvents
Running org.apache.hadoop.mapreduce.v2.hs.TestHistoryFileManager
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.79 sec - in org.apache.hadoop.mapreduce.v2.hs.TestHistoryFileManager
Running org.apache.hadoop.mapreduce.v2.hs.TestJHSDelegationTokenSecretManager
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.18 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJHSDelegationTokenSecretManager

Results :

Tests in error: 
  TestJobListCache.testAddExisting:39 »  test timed out after 1000 milliseconds

Tests run: 209, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.674 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:16 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 47.638 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  9.119 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [12:56 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. FAILURE [07:42 min]
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:00 min
[INFO] Finished at: 2016-04-20T21:51:49+00:00
[INFO] Final Memory: 38M/1198M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-hs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-hs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobListCache.testAddExisting

Error Message:
test timed out after 1000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 1000 milliseconds
	at sun.misc.Unsafe.putOrderedObject(Native Method)
	at java.util.concurrent.ConcurrentHashMap.setEntryAt(ConcurrentHashMap.java:316)
	at java.util.concurrent.ConcurrentHashMap$Segment.put(ConcurrentHashMap.java:462)
	at java.util.concurrent.ConcurrentHashMap.putIfAbsent(ConcurrentHashMap.java:1150)
	at java.lang.ClassLoader.getClassLoadingLock(ClassLoader.java:464)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:405)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.mockito.cglib.core.AbstractClassGenerator.<init>(AbstractClassGenerator.java:39)
	at org.mockito.cglib.core.KeyFactory$Generator.<init>(KeyFactory.java:128)
	at org.mockito.cglib.core.KeyFactory.create(KeyFactory.java:113)
	at org.mockito.cglib.core.KeyFactory.create(KeyFactory.java:109)
	at org.mockito.cglib.core.KeyFactory.create(KeyFactory.java:105)
	at org.mockito.cglib.proxy.Enhancer.<clinit>(Enhancer.java:70)
	at org.mockito.internal.creation.jmock.ClassImposterizer.createProxyClass(ClassImposterizer.java:68)
	at org.mockito.internal.creation.jmock.ClassImposterizer.imposterise(ClassImposterizer.java:50)
	at org.mockito.internal.util.MockUtil.createMock(MockUtil.java:54)
	at org.mockito.internal.MockitoCore.mock(MockitoCore.java:45)
	at org.mockito.Mockito.mock(Mockito.java:921)
	at org.mockito.Mockito.mock(Mockito.java:816)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobListCache.testAddExisting(TestJobListCache.java:39)
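
Note where this timeout fires: every frame above the test is class loading inside Mockito's first cglib proxy generation, not the cache logic itself, so a 1000 ms @Test timeout on the first test to call mock() can expire purely on cold-JVM classloading when the build slave is busy. A minimal sketch of the pattern, assuming a Mockito 1.x-style cglib mock as the trace shows; the test body is hypothetical, not the actual TestJobListCache code.

    import static org.mockito.Mockito.mock;

    import org.junit.Test;

    // Illustrative only: the first mock() in the JVM pays a one-off cost of
    // generating proxy classes, which is what the 1000 ms budget above was
    // spent on. A warm-up mock in @BeforeClass or a larger timeout avoids
    // turning that overhead into a test error on slow slaves.
    public class MockWarmupSketch {
      @Test(timeout = 1000)
      public void addExisting() {
        Runnable cachedJob = mock(Runnable.class); // triggers proxy class generation on first use
        cachedJob.run();                           // exercise the mock so it is not unused
      }
    }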



Hadoop-Mapreduce-trunk - Build # 3219 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3219/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32044 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.259 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.651 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.253 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.249 sec - in org.apache.hadoop.hdfs.TestNNBench

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.592 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:05 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 32.288 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.223 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:50 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:58 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:49 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:08 h
[INFO] Finished at: 2016-04-20T20:11:34+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3218 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3218/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32005 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.044 sec - in org.apache.hadoop.mapred.pipes.TestPipes
Running org.apache.hadoop.mapred.pipes.TestPipesNonJavaInputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.663 sec - in org.apache.hadoop.mapred.pipes.TestPipesNonJavaInputFormat
Running org.apache.hadoop.mapred.pipes.TestPipeApplication
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.431 sec - in org.apache.hadoop.mapred.pipes.TestPipeApplication
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.701 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.041 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.352 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 529, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.058 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:53 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.415 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.182 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:20 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:38 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:53 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:11 h
[INFO] Finished at: 2016-04-19T21:08:50+00:00
[INFO] Final Memory: 34M/597M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3217 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3217/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32019 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.043 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.279 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.381 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.822 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.106 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.088 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.222 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:21 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:36 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:00 h
[INFO] Finished at: 2016-04-19T15:38:27+00:00
[INFO] Final Memory: 34M/607M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3216 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3216/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32009 lines...]
Running org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.093 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.755 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.147 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.954 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.921 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.041 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.098 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:23 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:40 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-04-19T13:02:38+00:00
[INFO] Final Memory: 33M/611M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3215 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3215/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31590 lines...]
Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.139 sec - in org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobs
Running org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.093 sec - in org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices
Running org.apache.hadoop.mapreduce.v2.hs.TestJobHistory
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.866 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJobHistory
Running org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEvents
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 43.283 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEvents
Running org.apache.hadoop.mapreduce.v2.hs.TestHistoryFileManager
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.993 sec - in org.apache.hadoop.mapreduce.v2.hs.TestHistoryFileManager
Running org.apache.hadoop.mapreduce.v2.hs.TestJHSDelegationTokenSecretManager
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.237 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJHSDelegationTokenSecretManager

Results :

Tests in error: 
  TestJobListCache.testAddExisting:39 »  test timed out after 1000 milliseconds

Tests run: 209, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.472 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:13 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 46.061 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  9.720 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [12:53 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. FAILURE [07:34 min]
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 24:45 min
[INFO] Finished at: 2016-04-19T09:39:54+00:00
[INFO] Final Memory: 36M/692M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-hs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-hs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobListCache.testAddExisting

Error Message:
test timed out after 1000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 1000 milliseconds
	at java.util.zip.Inflater.inflate(Inflater.java:260)
	at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:152)
	at sun.misc.Resource.getBytes(Resource.java:124)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:444)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.mockito.asm.ClassWriter.<init>(ClassWriter.java:537)
	at org.mockito.cglib.core.DebuggingClassWriter.<init>(DebuggingClassWriter.java:47)
	at org.mockito.cglib.core.DefaultGeneratorStrategy.getClassWriter(DefaultGeneratorStrategy.java:30)
	at org.mockito.cglib.core.DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:24)
	at org.mockito.cglib.core.AbstractClassGenerator.create(AbstractClassGenerator.java:217)
	at org.mockito.cglib.core.KeyFactory$Generator.create(KeyFactory.java:145)
	at org.mockito.cglib.core.KeyFactory.create(KeyFactory.java:117)
	at org.mockito.cglib.core.KeyFactory.create(KeyFactory.java:109)
	at org.mockito.cglib.core.KeyFactory.create(KeyFactory.java:105)
	at org.mockito.cglib.proxy.Enhancer.<clinit>(Enhancer.java:70)
	at org.mockito.internal.creation.jmock.ClassImposterizer.createProxyClass(ClassImposterizer.java:68)
	at org.mockito.internal.creation.jmock.ClassImposterizer.imposterise(ClassImposterizer.java:50)
	at org.mockito.internal.util.MockUtil.createMock(MockUtil.java:54)
	at org.mockito.internal.MockitoCore.mock(MockitoCore.java:45)
	at org.mockito.Mockito.mock(Mockito.java:921)
	at org.mockito.Mockito.mock(Mockito.java:816)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobListCache.testAddExisting(TestJobListCache.java:39)
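
Unlike the assertion failures elsewhere in these reports, this one is a JUnit 4 per-test timeout: the trace is a snapshot of what the test thread was doing (Mockito proxy-class generation and class loading) when the 1000 ms limit expired, not the site of a failed check. A minimal sketch of how such a timeout is declared, with a stand-in slow step rather than the actual TestJobListCache body:

import static org.mockito.Mockito.mock;
import org.junit.Test;

// Minimal sketch, not the actual test: a JUnit 4 test with a 1000 ms
// timeout. If the body overruns, JUnit fails the test with
// "test timed out after 1000 milliseconds" and attaches the test thread's
// stack trace at that instant, which is why the trace above ends inside
// classloading/Mockito code on a slow build slave.
public class TimeoutExample {

  @Test(timeout = 1000)
  public void addExistingSketch() throws Exception {
    Runnable stub = mock(Runnable.class); // first mock() call triggers proxy-class generation
    Thread.sleep(2000);                   // hypothetical slow step that trips the limit
  }
}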



Hadoop-Mapreduce-trunk - Build # 3214 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3214/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32018 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.395 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.672 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.257 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.358 sec - in org.apache.hadoop.hdfs.TestNNBench

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.977 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:53 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.454 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.779 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:18 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:41 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-19T04:03:06+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
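
The "expected:<0> but was:<4>" wording is JUnit's assertEquals formatting: going by the message, the test asserts that the job's output directory holds zero files after the committer aborts, and four files were left behind. A minimal sketch of that style of check, with a hypothetical path and without the actual committer setup from TestMRCJCFileOutputCommitter:

import static org.junit.Assert.assertEquals;
import java.io.File;
import org.junit.Test;

// Minimal sketch, not the Hadoop test code: count what is left in the
// output directory after an abort and assert it is empty. On mismatch,
// JUnit reports "<message> expected:<0> but was:<N>", as seen above.
public class AbortCleanupExample {

  @Test
  public void outputDirEmptyAfterAbort() {
    File outputDir = new File("/tmp/test-output"); // hypothetical location

    // ... run a task against outputDir and abort it here ...

    File[] leftovers = outputDir.listFiles();
    int count = (leftovers == null) ? 0 : leftovers.length;
    assertEquals("Output directory not empty", 0, count);
  }
}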


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3213 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3213/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32018 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.044 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.128 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.241 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.936 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.884 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.499 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.256 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:21 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:39 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2016-04-19T00:35:08+00:00
[INFO] Final Memory: 33M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3212 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3212/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31589 lines...]
Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.664 sec - in org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesJobs
Running org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.915 sec - in org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices
Running org.apache.hadoop.mapreduce.v2.hs.TestJobHistory
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.248 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJobHistory
Running org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEvents
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 46.121 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEvents
Running org.apache.hadoop.mapreduce.v2.hs.TestHistoryFileManager
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.645 sec - in org.apache.hadoop.mapreduce.v2.hs.TestHistoryFileManager
Running org.apache.hadoop.mapreduce.v2.hs.TestJHSDelegationTokenSecretManager
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.522 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJHSDelegationTokenSecretManager

Results :

Tests in error: 
  TestJobListCache.testAddExisting:39 »  test timed out after 1000 milliseconds

Tests run: 209, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.593 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:09 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 48.005 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  9.399 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [13:16 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. FAILURE [08:26 min]
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:58 min
[INFO] Finished at: 2016-04-18T20:54:05+00:00
[INFO] Final Memory: 36M/693M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-hs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-hs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobListCache.testAddExisting

Error Message:
test timed out after 1000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 1000 milliseconds
	at java.lang.Thread.currentThread(Native Method)
	at java.util.concurrent.locks.ReentrantLock$Sync.tryRelease(ReentrantLock.java:154)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.release(AbstractQueuedSynchronizer.java:1260)
	at java.util.concurrent.locks.ReentrantLock.unlock(ReentrantLock.java:460)
	at java.util.concurrent.ConcurrentHashMap$Segment.put(ConcurrentHashMap.java:470)
	at java.util.concurrent.ConcurrentHashMap.putIfAbsent(ConcurrentHashMap.java:1150)
	at java.lang.ClassLoader.getClassLoadingLock(ClassLoader.java:464)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:405)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.mockito.cglib.core.AbstractClassGenerator.getClassName(AbstractClassGenerator.java:73)
	at org.mockito.cglib.core.AbstractClassGenerator.getClassName(AbstractClassGenerator.java:67)
	at org.mockito.cglib.core.KeyFactory$Generator.generateClass(KeyFactory.java:173)
	at org.mockito.cglib.core.DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:25)
	at org.mockito.cglib.core.AbstractClassGenerator.create(AbstractClassGenerator.java:217)
	at org.mockito.cglib.core.KeyFactory$Generator.create(KeyFactory.java:145)
	at org.mockito.cglib.core.KeyFactory.create(KeyFactory.java:117)
	at org.mockito.cglib.core.KeyFactory.create(KeyFactory.java:109)
	at org.mockito.cglib.core.KeyFactory.create(KeyFactory.java:105)
	at org.mockito.cglib.proxy.Enhancer.<clinit>(Enhancer.java:70)
	at org.mockito.internal.creation.jmock.ClassImposterizer.createProxyClass(ClassImposterizer.java:68)
	at org.mockito.internal.creation.jmock.ClassImposterizer.imposterise(ClassImposterizer.java:50)
	at org.mockito.internal.util.MockUtil.createMock(MockUtil.java:54)
	at org.mockito.internal.MockitoCore.mock(MockitoCore.java:45)
	at org.mockito.Mockito.mock(Mockito.java:921)
	at org.mockito.Mockito.mock(Mockito.java:816)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobListCache.testAddExisting(TestJobListCache.java:39)



Hadoop-Mapreduce-trunk - Build # 3211 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3211/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32001 lines...]
Running org.apache.hadoop.mapred.pipes.TestPipeApplication
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.831 sec - in org.apache.hadoop.mapred.pipes.TestPipeApplication
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.624 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.891 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.454 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.865 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.520 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.194 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:19 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:49 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 h
[INFO] Finished at: 2016-04-18T16:16:30+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3210 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3210/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31460 lines...]
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.472 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.733 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.444 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.76 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.261 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.136 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:190->testTaskAttemptAssignedKilledHistory:403 No Ta Started JH Event

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.291 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:07 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 33.179 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.550 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:20 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:13 min
[INFO] Finished at: 2016-04-18T03:14:00+00:00
[INFO] Final Memory: 34M/661M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:403)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:190)
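
This failure, by contrast, carries a readable message because the assertion passes one explicitly: with the two-argument Assert.assertTrue(message, condition), the string becomes the AssertionError text, so the report shows "No Ta Started JH Event" instead of "null". A minimal sketch of the pattern, with a hypothetical flag in place of the actual job-history-event check in TestTaskAttempt:

import static org.junit.Assert.assertTrue;
import org.junit.Test;

// Minimal sketch, not the Hadoop test code: a messaged assertTrue, whose
// string surfaces as the failure's error message.
public class MessagedAssertionExample {

  @Test
  public void taskAttemptStartedEventRecorded() {
    boolean sawTaStartedEvent = false; // hypothetical flag derived from the recorded history events

    assertTrue("No Ta Started JH Event", sawTaStartedEvent);
  }
}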



Hadoop-Mapreduce-trunk - Build # 3209 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3209/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32001 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.042 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 73.479 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.736 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.132 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.933 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.553 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.160 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:21 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:43 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-04-17T09:05:29+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3208 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3208/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31178 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.428 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Running org.apache.hadoop.mapred.TestMapFileOutputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.388 sec - in org.apache.hadoop.mapred.TestMapFileOutputFormat
Running org.apache.hadoop.mapred.TestJobAclsManager
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.27 sec - in org.apache.hadoop.mapred.TestJobAclsManager
Running org.apache.hadoop.mapred.TestLineRecordReader
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.999 sec - in org.apache.hadoop.mapred.TestLineRecordReader
Running org.apache.hadoop.mapred.TestClock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.089 sec - in org.apache.hadoop.mapred.TestClock
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.095 sec - in org.apache.hadoop.mapred.TestJobQueueClient

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.837 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:53 min
[INFO] Finished at: 2016-04-17T03:58:42+00:00
[INFO] Final Memory: 30M/723M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk - Build # 3207 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3207/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32001 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.862 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.829 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.153 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.154 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.955 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.140 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.085 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:20 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:08 h
[INFO] Finished at: 2016-04-17T01:05:58+00:00
[INFO] Final Memory: 33M/611M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3206 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3206/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 33223 lines...]
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer

Tests run: 344, Failures: 0, Errors: 77, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.900 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:48 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.561 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.077 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [08:35 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11:01 min
[INFO] Finished at: 2016-04-16T01:17:46+00:00
[INFO] Final Memory: 34M/678M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
77 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidUri2

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)

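All of the failures in this build are the same TestContainerException wrapping java.net.BindException from the Grizzly test container: the port the Jersey test server tries to use is already bound when the next test class starts. A minimal, self-contained sketch (with an arbitrary, hypothetical port number) that reproduces the same exception outside Hadoop:

    // Reproduces "java.net.BindException: Address already in use" by binding
    // two server sockets to one fixed port; the Grizzly-backed JerseyTest
    // container fails the same way when its configured port is still held.
    import java.net.InetSocketAddress;
    import java.net.ServerSocket;

    public class BindExceptionExample {
        public static void main(String[] args) throws Exception {
            int port = 9998;                                   // hypothetical fixed port
            try (ServerSocket first = new ServerSocket()) {
                first.bind(new InetSocketAddress(port));       // first bind succeeds
                try (ServerSocket second = new ServerSocket()) {
                    second.bind(new InetSocketAddress(port));  // throws java.net.BindException
                }
            }
        }
    }
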

FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfo

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAM

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidAccept

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testBlacklistedNodesXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testBlacklistedNodes

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidUri

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testGetTaskAttemptIdXMLState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testPutTaskAttemptIdState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testGetTaskAttemptIdState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testPutTaskAttemptIdXMLState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptsXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdInvalid2

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdInvalid3

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptId

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdNonExist

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptsDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttempts

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdXMLCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptsSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdBogus

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConf

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCountersXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttemptsXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobsXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobsDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobsSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobId

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobs

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalidXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalidBogus

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttemptsSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCountersSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttempts

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalidDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCountersDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdNonExist

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttemptsDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdBogus

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testJobTaskCountersXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryReduce

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasks

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryMap

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid2

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid3

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCountersSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdNonExist

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCountersDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskId

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)
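
All of the TestAMWebServicesJobs and TestAMWebServicesTasks failures above share the same root cause: the embedded Jersey/Grizzly test container could not bind its HTTP port on the build slave because the port was already held (for example by another process or a not-yet-released container from an earlier test). Below is a minimal sketch of that failure mode using plain JDK sockets; the class name, the fixed port number, and the port-0 remedy are illustrative assumptions only and are not taken from the Hadoop test sources.

import java.io.IOException;
import java.net.BindException;
import java.net.InetSocketAddress;
import java.nio.channels.ServerSocketChannel;

public class BindCollisionSketch {
    public static void main(String[] args) throws IOException {
        // Hypothetical fixed port, standing in for whatever port the test container binds.
        int fixedPort = 9998;
        try (ServerSocketChannel first = ServerSocketChannel.open();
             ServerSocketChannel second = ServerSocketChannel.open()) {
            // First bind succeeds and holds the port, like a lingering test container.
            first.bind(new InetSocketAddress(fixedPort));
            try {
                // Second bind to the same port fails exactly as in the stack traces above.
                second.bind(new InetSocketAddress(fixedPort));
            } catch (BindException e) {
                System.out.println("Expected failure: " + e.getMessage());
            }
            // Remedy sketch: binding to port 0 lets the OS choose a free ephemeral port.
            try (ServerSocketChannel ephemeral = ServerSocketChannel.open()) {
                ephemeral.bind(new InetSocketAddress(0));
                int chosen = ((InetSocketAddress) ephemeral.getLocalAddress()).getPort();
                System.out.println("Ephemeral bind succeeded on port " + chosen);
            }
        }
    }
}

Binding to port 0 is the usual way a test harness sidesteps this kind of collision, at the cost of having to query the actually chosen port before pointing clients at it.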



Hadoop-Mapreduce-trunk - Build # 3205 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3205/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31470 lines...]
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.401 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.774 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.672 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.963 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.993 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Failed tests: 
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:190->testTaskAttemptAssignedKilledHistory:403 No Ta Started JH Event
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 344, Failures: 2, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.647 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:11 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 32.704 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.292 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:23 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:20 min
[INFO] Finished at: 2016-04-15T22:29:55+00:00
[INFO] Final Memory: 36M/868M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:403)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:190)



Hadoop-Mapreduce-trunk - Build # 3204 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3204/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31465 lines...]
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.349 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.874 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.396 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.426 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.717 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.201 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestKill.testKillJob:84 Task state not correct expected:<KILLED> but was:<RUNNING>

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.322 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:01 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.106 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.432 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [09:47 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:28 min
[INFO] Finished at: 2016-04-15T20:40:40+00:00
[INFO] Final Memory: 36M/868M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Task state not correct expected:<KILLED> but was:<RUNNING>

Stack Trace:
java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<RUNNING>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:84)
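
The console above already names the resume command (mvn <goals> -rf :hadoop-mapreduce-client-app). As a reproduction aid, and assuming a standard Surefire setup rather than anything specific to this job, the single failing test can usually be re-run on its own from the hadoop-mapreduce-client-app module with Surefire's -Dtest filter, for example mvn test -Dtest=TestKill, or narrowed to one method with the Class#method form (-Dtest=TestKill#testKillJob), which Surefire 2.17 accepts for JUnit 4 tests.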



Hadoop-Mapreduce-trunk - Build # 3203 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3203/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32005 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.961 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 75.953 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.192 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.438 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.842 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.757 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.095 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:20 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:00 h
[INFO] Finished at: 2016-04-15T19:58:41+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
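
The "null" error messages reported for the four TestUberAM failures above come from JUnit's single-argument assertTrue: when the condition fails, the AssertionError carries no message, so Surefire records the message as null. A minimal sketch, illustrative only (the class name and the counter check are assumptions, not the real TestMRJobs code):

import static org.junit.Assert.assertTrue;

import org.junit.Test;

/*
 * Illustrates why the failures above report "java.lang.AssertionError: null":
 * assertTrue(boolean) has no message argument, so a failing check surfaces
 * with a null message in the report.
 */
public class NullMessageAssertionSketch {

  @Test
  public void reportsNullMessage() {
    // Fails with a null message, matching the TestUberAM entries above.
    assertTrue(counterAtLeast(4));
  }

  @Test
  public void reportsDescriptiveMessage() {
    // The two-argument form would make the Jenkins report self-explanatory.
    assertTrue("counter below expected minimum", counterAtLeast(4));
  }

  // Hypothetical stand-in for the counter check in verifySleepJobCounters.
  private boolean counterAtLeast(int expected) {
    return false;
  }
}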



Hadoop-Mapreduce-trunk - Build # 3202 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3202/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31869 lines...]
Running org.apache.hadoop.mapred.TestInputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.376 sec - in org.apache.hadoop.mapred.TestInputPath
Running org.apache.hadoop.mapred.TestLineRecordReaderJobs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.562 sec - in org.apache.hadoop.mapred.TestLineRecordReaderJobs
Running org.apache.hadoop.mapred.TestMRIntermediateDataEncryption
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 256.762 sec - in org.apache.hadoop.mapred.TestMRIntermediateDataEncryption
Running org.apache.hadoop.mapred.TestAuditLogger
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.293 sec - in org.apache.hadoop.mapred.TestAuditLogger
Running org.apache.hadoop.mapred.TestYARNRunner

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 326, Failures: 4, Errors: 0, Skipped: 6

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.894 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.083 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.076 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:14 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:06 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:24 h
[INFO] Finished at: 2016-04-15T07:22:05+00:00
[INFO] Final Memory: 38M/638M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.7.0_55/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter7201134271799159084.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire2978371516253715184tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_1416064574178948746161tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3201 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3201/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31468 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.723 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestBlocks
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.311 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.892 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.323 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.41 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.281 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.589 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:42 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 40.648 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [ 10.609 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:48 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:29 min
[INFO] Finished at: 2016-04-14T22:38:30+00:00
[INFO] Final Memory: 43M/1247M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)



Hadoop-Mapreduce-trunk - Build # 3200 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3200/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32001 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.063 sec - in org.apache.hadoop.mapred.pipes.TestPipes
Running org.apache.hadoop.mapred.pipes.TestPipesNonJavaInputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.214 sec - in org.apache.hadoop.mapred.pipes.TestPipesNonJavaInputFormat
Running org.apache.hadoop.mapred.pipes.TestPipeApplication
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.803 sec - in org.apache.hadoop.mapred.pipes.TestPipeApplication
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.462 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.12 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.385 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 531, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.153 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.133 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.191 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:24 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:59 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:17 h
[INFO] Finished at: 2016-04-14T21:15:14+00:00
[INFO] Final Memory: 37M/748M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3199 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3199/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32010 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.043 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.405 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.47 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.301 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.809 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.391 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.112 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:18 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:38 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:00 h
[INFO] Finished at: 2016-04-14T15:34:55+00:00
[INFO] Final Memory: 34M/611M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3198 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3198/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32010 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.043 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.336 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.673 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.972 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.863 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:13 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 32.944 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.009 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:18 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:13 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:45 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-04-14T13:07:19+00:00
[INFO] Final Memory: 34M/611M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3197 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3197/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31462 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.979 sec - in org.apache.hadoop.mapreduce.v2.app.launcher.TestContainerLauncher
Running org.apache.hadoop.mapreduce.v2.app.TestFail
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.229 sec - in org.apache.hadoop.mapreduce.v2.app.TestFail
Running org.apache.hadoop.mapreduce.v2.app.TestMRApp
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.634 sec - in org.apache.hadoop.mapreduce.v2.app.TestMRApp
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.733 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.453 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.25 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.425 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:06 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 32.170 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.905 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [09:50 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:40 min
[INFO] Finished at: 2016-04-14T03:20:54+00:00
[INFO] Final Memory: 34M/675M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)



Hadoop-Mapreduce-trunk - Build # 3196 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3196/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31178 lines...]
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.232 sec - in org.apache.hadoop.mapreduce.lib.input.TestLineRecordReader
Running org.apache.hadoop.mapreduce.lib.input.TestCombineFileRecordReader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.106 sec - in org.apache.hadoop.mapreduce.lib.input.TestCombineFileRecordReader
Running org.apache.hadoop.mapreduce.lib.output.TestFileOutputCommitter
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.266 sec - in org.apache.hadoop.mapreduce.lib.output.TestFileOutputCommitter
Running org.apache.hadoop.mapreduce.lib.output.TestMapFileOutputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.727 sec - in org.apache.hadoop.mapreduce.lib.output.TestMapFileOutputFormat
Running org.apache.hadoop.mapreduce.lib.output.TestPreemptableFileOutputCommitter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.035 sec - in org.apache.hadoop.mapreduce.lib.output.TestPreemptableFileOutputCommitter
Running org.apache.hadoop.mapreduce.lib.output.TestFileOutputFormat
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.852 sec - in org.apache.hadoop.mapreduce.lib.output.TestFileOutputFormat

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.025 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [03:14 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:22 min
[INFO] Finished at: 2016-04-14T01:44:12+00:00
[INFO] Final Memory: 30M/718M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
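
The "null" error message above is what JUnit reports when the message-less Assert.assertTrue(boolean) overload fails: it delegates to the message-bearing overload with a null message, so the resulting AssertionError carries no text (the Assert.java:52 and Assert.java:41 frames are those two overloads). A minimal, self-contained sketch of the pattern; the jobLookupSucceeded() condition is a hypothetical stand-in for whatever TestCLI.java:181 actually checks, which the digest does not show:

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class NullMessageAssertionSketch {

      // Hypothetical stand-in for the condition checked at TestCLI.java:181.
      private boolean jobLookupSucceeded() {
        return false;
      }

      @Test
      public void messageLessAssert() {
        // Throws an AssertionError whose message is null, which the digest
        // above renders as "java.lang.AssertionError: null".
        assertTrue(jobLookupSucceeded());
      }

      @Test
      public void assertWithMessage() {
        // The two-argument overload would make the same failure
        // self-explanatory in the build report.
        assertTrue("expected the CLI to find the submitted job", jobLookupSucceeded());
      }
    }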



Hadoop-Mapreduce-trunk - Build # 3195 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3195/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 36706 lines...]
  +-org.apache.hadoop:hadoop-common:3.0.0-20160414.000344-7603
    +-org.apache.hadoop:hadoop-annotations:3.0.0-SNAPSHOT
and
+-org.apache.hadoop:hadoop-mapreduce-client:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-annotations:3.0.0-20160414.000315-8514
and
+-org.apache.hadoop:hadoop-mapreduce-client:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-common:3.0.0-20160414.000344-7603
    +-org.apache.hadoop:hadoop-annotations:3.0.0-SNAPSHOT

[WARNING] Rule 0: org.apache.maven.plugins.enforcer.DependencyConvergence failed with message:
Failed while enforcing releasability the error(s) are [
Dependency convergence error for org.apache.hadoop:hadoop-annotations:3.0.0-SNAPSHOT paths to dependency are:
+-org.apache.hadoop:hadoop-mapreduce-client:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-common:3.0.0-20160414.000344-7603
    +-org.apache.hadoop:hadoop-annotations:3.0.0-SNAPSHOT
and
+-org.apache.hadoop:hadoop-mapreduce-client:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-annotations:3.0.0-20160414.000315-8514
and
+-org.apache.hadoop:hadoop-mapreduce-client:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-common:3.0.0-20160414.000344-7603
    +-org.apache.hadoop:hadoop-annotations:3.0.0-SNAPSHOT
]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... FAILURE [ 10.871 s]
[INFO] Apache Hadoop MapReduce Core ...................... SKIPPED
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12.207 s
[INFO] Finished at: 2016-04-14T00:05:13+00:00
[INFO] Final Memory: 26M/723M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (depcheck) on project hadoop-mapreduce-client: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Mapreduce-trunk - Build # 3194 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3194/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31997 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.633 sec - in org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter
Running org.apache.hadoop.mapred.TestReporter
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.773 sec - in org.apache.hadoop.mapred.TestReporter
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.924 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 73.739 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.487 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.066 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 529, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.860 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.605 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.123 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:15 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:40 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:54 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:11 h
[INFO] Finished at: 2016-04-13T21:58:01+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3193 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3193/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32010 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.341 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.737 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.301 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.795 sec - in org.apache.hadoop.hdfs.TestNNBench

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.802 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:24 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 35.573 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.381 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:11 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:03 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:49 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-04-13T19:16:23+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
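
The failure above is an assertion that the job output directory contains zero files after the committer aborts, and four leftover files were found. A minimal sketch of that style of check, using java.io.File and a placeholder path in place of the Hadoop FileSystem API and committer setup the real test drives:

    import static org.junit.Assert.assertEquals;

    import java.io.File;
    import org.junit.Test;

    public class AbortCleanupAssertionSketch {

      @Test
      public void outputDirIsEmptyAfterAbort() {
        // Placeholder for the committer's output location; the real test works
        // against a FileOutputCommitter directory on the local file system.
        File outputDir = new File("build/test-output");
        outputDir.mkdirs();

        // ... run the task attempt and abort it here; leftover temporary files
        // would remain in outputDir if cleanup did not run ...

        File[] leftovers = outputDir.listFiles();
        int fileCount = (leftovers == null) ? 0 : leftovers.length;
        assertEquals("Output directory not empty", 0, fileCount);
      }
    }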


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3192 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3192/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 36587 lines...]
  TestMRApp.testZeroMapReduces:93 » YarnRuntime Error creating staging dir
  TestMRApp.testCountersOnJobFinish:530 » YarnRuntime Error creating staging dir
  TestMRApp.testZeroMaps:85 » YarnRuntime Error creating staging dir
  TestMRApp.testMapReduce:76 » YarnRuntime Error creating staging dir
  TestMRApp.testJobRebootOnLastRetryOnUnregistrationFailure:492 » YarnRuntime Er...
  TestFetchFailure.testFetchFailureWithRecovery:182 » YarnRuntime Error creating...
  TestFetchFailure.testFetchFailureMultipleReduces:275 » YarnRuntime Error creat...
  TestFetchFailure.testFetchFailure:61 » YarnRuntime Error creating staging dir
  TestRMContainerAllocator.testReportedAppProgress:935 » YarnRuntime Error creat...
  TestRMContainerAllocator.testReportedAppProgressWithOnlyMaps:1087 » YarnRuntime
  TestRMContainerAllocator.testUnregistrationOnlyIfRegistered:2404 » YarnRuntime
  TestKill.testKillTaskAttempt:363 » YarnRuntime Error creating staging dir
  TestKill.testKillTaskWaitKillJobAfterTA_DONE:227 » YarnRuntime Error creating ...
  TestKill.testKillJob:66 » YarnRuntime Error creating staging dir
  TestKill.testKillTaskWaitKillJobBeforeTA_DONE:275 » YarnRuntime Error creating...
  TestKill.testKillTask:100 » YarnRuntime Error creating staging dir
  TestKill.testKillTaskWait:189 » YarnRuntime Error creating staging dir

Tests run: 327, Failures: 3, Errors: 95, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.050 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.026 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.314 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [07:10 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 09:39 min
[INFO] Finished at: 2016-04-13T11:38:23+00:00
[INFO] Final Memory: 34M/661M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app && /home/jenkins/tools/java/jdk1.7.0_55/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefirebooter2340891731251049200.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefire5255151943858757869tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefire_64800827372113722323tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
98 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestAMInfos.testAMInfosWithoutRecoveryEnabled

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestAMInfos.testAMInfosWithoutRecoveryEnabled(TestAMInfos.java:48)
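
The staging-dir errors in this run share the same shape: MRApp.serviceInit (MRApp.java:267) asks the test file system to create the job staging directory, the underlying RPC to the NameNode never connects (the SocketChannelImpl/Client frames at the top of the trace), and the resulting IOException appears to be wrapped as YarnRuntimeException("Error creating staging dir"). A minimal sketch of that wrap-and-rethrow, with a stand-in exception class, a simulated connect failure, and a placeholder staging path, since only the wrapping is visible in the digest:

    import java.io.IOException;

    public class StagingDirSetupSketch {

      // Stand-in for org.apache.hadoop.yarn.exceptions.YarnRuntimeException so
      // the sketch compiles without Hadoop on the classpath.
      static class YarnRuntimeException extends RuntimeException {
        YarnRuntimeException(String message, Throwable cause) {
          super(message, cause);
        }
      }

      // Stand-in for FileSystem.mkdirs(stagingDir); in the failing runs the
      // underlying RPC could not reach the NameNode, so it surfaced an IOException.
      static void mkdirsOverRpc(String stagingDir) throws IOException {
        throw new IOException("connection to NameNode refused (simulated)");
      }

      // Mirrors the pattern suggested by the trace: a checked IOException becomes
      // an unchecked YarnRuntimeException, which is why the affected tests all
      // report the identical "Error creating staging dir" message.
      static void createStagingDir(String stagingDir) {
        try {
          mkdirsOverRpc(stagingDir);
        } catch (IOException e) {
          throw new YarnRuntimeException("Error creating staging dir", e);
        }
      }

      public static void main(String[] args) {
        createStagingDir("/tmp/hadoop-yarn/staging"); // placeholder path
      }
    }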


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestFail.testMapFailureMaxPercent

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestFail.testMapFailureMaxPercent(TestFail.java:99)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestFail.testTimedOutTask

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestFail.testTimedOutTask(TestFail.java:160)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestFail.testFailTask

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestFail.testFailTask(TestFail.java:69)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestFail.testTaskFailWithUnusedContainer

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestFail.testTaskFailWithUnusedContainer(TestFail.java:186)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestFail.testReduceFailureMaxPercent

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestFail.testReduceFailureMaxPercent(TestFail.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestFetchFailure.testFetchFailureWithRecovery

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestFetchFailure.testFetchFailureWithRecovery(TestFetchFailure.java:182)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestFetchFailure.testFetchFailureMultipleReduces

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestFetchFailure.testFetchFailureMultipleReduces(TestFetchFailure.java:275)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestFetchFailure.testFetchFailure

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestFetchFailure.testFetchFailure(TestFetchFailure.java:61)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier.testNotificationOnLastRetryNormalShutdown

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.CGLIB$serviceInit$24(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d$$FastClassByMockitoWithCGLIB$$4c1ee690.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.serviceInit(<generated>)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.CGLIB$init$66(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d$$FastClassByMockitoWithCGLIB$$4c1ee690.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.init(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.CGLIB$submit$1(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d$$FastClassByMockitoWithCGLIB$$4c1ee690.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.submit(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.CGLIB$submit$2(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d$$FastClassByMockitoWithCGLIB$$4c1ee690.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.submit(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier.testNotificationOnLastRetryNormalShutdown(TestJobEndNotifier.java:210)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier.testAbsentNotificationOnNotLastRetryUnregistrationFailure

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.CGLIB$serviceInit$24(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d$$FastClassByMockitoWithCGLIB$$4c1ee690.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.serviceInit(<generated>)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.CGLIB$init$66(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d$$FastClassByMockitoWithCGLIB$$4c1ee690.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.init(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.CGLIB$submit$1(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d$$FastClassByMockitoWithCGLIB$$4c1ee690.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.submit(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.CGLIB$submit$2(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d$$FastClassByMockitoWithCGLIB$$4c1ee690.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.submit(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier.testAbsentNotificationOnNotLastRetryUnregistrationFailure(TestJobEndNotifier.java:233)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier.testNotificationOnLastRetryUnregistrationFailure

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.CGLIB$serviceInit$24(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d$$FastClassByMockitoWithCGLIB$$4c1ee690.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.serviceInit(<generated>)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.CGLIB$init$66(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d$$FastClassByMockitoWithCGLIB$$4c1ee690.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.init(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.CGLIB$submit$1(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d$$FastClassByMockitoWithCGLIB$$4c1ee690.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.submit(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.CGLIB$submit$2(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d$$FastClassByMockitoWithCGLIB$$4c1ee690.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier$MRAppWithCustomContainerAllocator$$EnhancerByMockitoWithCGLIB$$5d61c06d.submit(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.TestJobEndNotifier.testNotificationOnLastRetryUnregistrationFailure(TestJobEndNotifier.java:266)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskAttempt

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskAttempt(TestKill.java:363)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskWaitKillJobAfterTA_DONE

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskWaitKillJobAfterTA_DONE(TestKill.java:227)
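
The TestKill and TestMRApp failures that follow repeat the same pattern: MRApp.submit -> AbstractService.init -> MRApp.serviceInit -> FileSystem.mkdirs, so each test aborts before a job is even submitted. For comparison, here is a hedged sketch of a configuration that keeps the staging directory on the local filesystem, so mkdirs never needs a live NameNode; the property name is the standard MapReduce staging-dir key, but the paths are placeholders and this is not how these particular tests are wired.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class LocalStagingConf {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "file:///");  // local filesystem, no RPC involved
        conf.set("yarn.app.mapreduce.am.staging-dir",
            System.getProperty("java.io.tmpdir") + "/mr-staging");  // placeholder path
        FileSystem fs = FileSystem.get(conf);
        Path staging = new Path(conf.get("yarn.app.mapreduce.am.staging-dir"));
        System.out.println("mkdirs on local staging dir: " + fs.mkdirs(staging));
      }
    }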


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:66)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskWaitKillJobBeforeTA_DONE

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskWaitKillJobBeforeTA_DONE(TestKill.java:275)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTask

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTask(TestKill.java:100)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskWait

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskWait(TestKill.java:189)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testCommitPending

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testCommitPending(TestMRApp.java:100)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testContainerPassThrough

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testContainerPassThrough(TestMRApp.java:577)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testJobError

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testJobError(TestMRApp.java:434)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testJobSuccess

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testJobSuccess(TestMRApp.java:454)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testUpdatedNodes

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testUpdatedNodes(TestMRApp.java:211)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testJobRebootNotLastRetryOnUnregistrationFailure

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testJobRebootNotLastRetryOnUnregistrationFailure(TestMRApp.java:467)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testZeroMapReduces

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testZeroMapReduces(TestMRApp.java:93)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testCountersOnJobFinish

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testCountersOnJobFinish(TestMRApp.java:530)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testZeroMaps

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testZeroMaps(TestMRApp.java:85)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testMapReduce

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testMapReduce(TestMRApp.java:76)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testJobRebootOnLastRetryOnUnregistrationFailure

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testJobRebootOnLastRetryOnUnregistrationFailure(TestMRApp.java:492)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRAppComponentDependencies.testComponentStopOrder

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRAppComponentDependencies.testComponentStopOrder(TestMRAppComponentDependencies.java:46)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterCredentials

Error Message:
Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy20.create(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:294)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy21.create(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:242)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1183)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1125)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:415)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:412)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:412)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:355)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:921)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:902)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:798)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
	at org.apache.hadoop.security.Credentials.writeTokenStorageFile(Credentials.java:234)
	at org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterCredentials(TestMRAppMaster.java:443)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterJobLaunchTime

Error Message:
Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy20.create(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:294)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy21.create(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:242)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1183)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1125)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:415)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:412)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:412)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:355)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:921)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:902)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:798)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:587)
	at org.apache.hadoop.mapreduce.split.JobSplitWriter.createFile(JobSplitWriter.java:101)
	at org.apache.hadoop.mapreduce.split.JobSplitWriter.createSplitFiles(JobSplitWriter.java:90)
	at org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterJobLaunchTime(TestMRAppMaster.java:220)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterMaxAppAttempts

Error Message:
Error while initializing

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error while initializing
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy20.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy21.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1433)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:314)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMasterTest.serviceInit(TestMRAppMaster.java:590)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$6.run(MRAppMaster.java:1644)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1641)
	at org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterMaxAppAttempts(TestMRAppMaster.java:369)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterMidLock

Error Message:
Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy20.create(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:294)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy21.create(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:242)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1183)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1125)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:415)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:412)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:412)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:355)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:921)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:902)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:798)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
	at org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterMidLock(TestMRAppMaster.java:171)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterSuccessLock

Error Message:
Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy20.create(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:294)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy21.create(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:242)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1183)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1125)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:415)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:412)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:412)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:355)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:921)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:902)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:798)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
	at org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterSuccessLock(TestMRAppMaster.java:247)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterMissingStaging

Error Message:
Error while initializing

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error while initializing
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy20.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy21.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1433)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:314)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMasterTest.serviceInit(TestMRAppMaster.java:590)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$6.run(MRAppMaster.java:1644)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1641)
	at org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterMissingStaging(TestMRAppMaster.java:331)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterShutDownJob

Error Message:
Error while initializing

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error while initializing
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy20.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy21.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1433)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:314)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMasterTest.serviceInit(TestMRAppMaster.java:590)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMasterTest$$EnhancerByMockitoWithCGLIB$$7c33c602.CGLIB$serviceInit$2(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMasterTest$$EnhancerByMockitoWithCGLIB$$7c33c602$$FastClassByMockitoWithCGLIB$$14e3d10d.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMasterTest$$EnhancerByMockitoWithCGLIB$$7c33c602.serviceInit(<generated>)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMasterTest$$EnhancerByMockitoWithCGLIB$$7c33c602.CGLIB$init$63(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMasterTest$$EnhancerByMockitoWithCGLIB$$7c33c602$$FastClassByMockitoWithCGLIB$$14e3d10d.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMasterTest$$EnhancerByMockitoWithCGLIB$$7c33c602.init(<generated>)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$6.run(MRAppMaster.java:1644)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1641)
	at org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterShutDownJob(TestMRAppMaster.java:522)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterFailLock

Error Message:
Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy20.create(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:294)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy21.create(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:242)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1183)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1125)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:415)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:412)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:412)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:355)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:921)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:902)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:798)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
	at org.apache.hadoop.mapreduce.v2.app.TestMRAppMaster.testMRAppMasterFailLock(TestMRAppMaster.java:285)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRClientService.test

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRClientService.test(TestMRClientService.java:82)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRClientService.testViewAclOnlyCannotModify

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestMRClientService.testViewAclOnlyCannotModify(TestMRClientService.java:206)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testMultipleCrashes

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testMultipleCrashes(TestRecovery.java:700)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoveryFailsUsingCustomOutputCommitter

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoveryFailsUsingCustomOutputCommitter(TestRecovery.java:580)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashOfMapsOnlyJob

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashOfMapsOnlyJob(TestRecovery.java:334)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testOutputRecovery

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testOutputRecovery(TestRecovery.java:828)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testOutputRecoveryMapsOnly

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testOutputRecoveryMapsOnly(TestRecovery.java:937)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoverySuccessUsingCustomOutputCommitter

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoverySuccessUsingCustomOutputCommitter(TestRecovery.java:467)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed(TestRecovery.java:136)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testSpeculative

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testSpeculative(TestRecovery.java:1170)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoveryWithoutShuffleSecret

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoveryWithoutShuffleSecret(TestRecovery.java:1323)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoveryWithOldCommiter

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoveryWithOldCommiter(TestRecovery.java:1052)
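
Every TestRecovery failure above shares the same root cause visible in the stack traces: MRApp.serviceInit cannot create the job staging directory because the HDFS NameNode at localhost:37852 refuses connections, and the resulting ConnectException surfaces as the YarnRuntimeException "Error creating staging dir". The following is a minimal sketch of the failing call path, not taken from the test code itself; the staging path is hypothetical and the port is simply the one reported in the traces:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class StagingDirProbe {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Illustrative: point the client at the address the tests tried to reach.
        conf.set("fs.defaultFS", "hdfs://localhost:37852");
        FileSystem fs = FileSystem.get(conf);
        // Same mkdirs() chain as in the traces:
        // FileSystem.mkdirs -> DistributedFileSystem.mkdirs -> DFSClient.mkdirs.
        // With no NameNode listening it throws java.net.ConnectException,
        // which MRApp.serviceInit rethrows as the YarnRuntimeException above.
        fs.mkdirs(new Path("/tmp/hadoop-yarn/staging/example"));
      }
    }

Against a live (Mini)DFS cluster the same mkdirs call succeeds, so these failures point at the test HDFS cluster not being up rather than at the recovery logic under test.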


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestStagingCleanup.testNoDeletionofStagingOnReboot

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1433)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler.mkdir(JobHistoryEventHandler.java:288)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler.serviceInit(JobHistoryEventHandler.java:173)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:491)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.TestStagingCleanup.testNoDeletionofStagingOnReboot(TestStagingCleanup.java:157)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestStagingCleanup.testDeletionofStagingOnUnregistrationFailure

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1433)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler.mkdir(JobHistoryEventHandler.java:288)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler.serviceInit(JobHistoryEventHandler.java:173)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:491)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.TestStagingCleanup.testDeletionofStagingOnUnregistrationFailure(TestStagingCleanup.java:103)
	at org.apache.hadoop.mapreduce.v2.app.TestStagingCleanup.testDeletionofStagingOnUnregistrationFailure(TestStagingCleanup.java:82)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestStagingCleanup.testStagingCleanupOrder

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestStagingCleanup.testStagingCleanupOrder(TestStagingCleanup.java:469)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestStagingCleanup.testDeletionofStagingOnKill

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1433)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler.mkdir(JobHistoryEventHandler.java:288)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler.serviceInit(JobHistoryEventHandler.java:173)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:491)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.TestStagingCleanup.testDeletionofStagingOnKill(TestStagingCleanup.java:209)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestStagingCleanup.testDeletionofStaging

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1433)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler.mkdir(JobHistoryEventHandler.java:288)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryEventHandler.serviceInit(JobHistoryEventHandler.java:173)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:491)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.TestStagingCleanup.testDeletionofStaging(TestStagingCleanup.java:134)


FAILED:  org.apache.hadoop.mapreduce.v2.app.commit.TestCommitterEventHandler.testFailure

Error Message:
Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy19.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy20.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1433)
	at org.apache.hadoop.mapreduce.v2.app.commit.TestCommitterEventHandler.testFailure(TestCommitterEventHandler.java:322)


FAILED:  org.apache.hadoop.mapreduce.v2.app.commit.TestCommitterEventHandler.testBasic

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.app.commit.TestCommitterEventHandler.testBasic(TestCommitterEventHandler.java:269)


FAILED:  org.apache.hadoop.mapreduce.v2.app.commit.TestCommitterEventHandler.testCommitWindow

Error Message:
committer did not register a heartbeat callback expected:<1> but was:<0>

Stack Trace:
java.lang.AssertionError: committer did not register a heartbeat callback expected:<1> but was:<0>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapreduce.v2.app.commit.TestCommitterEventHandler.testCommitWindow(TestCommitterEventHandler.java:155)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testCheckJobCompleteSuccess

Error Message:
test timed out after 20000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 20000 milliseconds
	at sun.misc.Unsafe.park(Native Method)
	at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2043)
	at java.util.concurrent.CyclicBarrier.dowait(CyclicBarrier.java:227)
	at java.util.concurrent.CyclicBarrier.await(CyclicBarrier.java:355)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testCheckJobCompleteSuccess(TestJobImpl.java:222)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testJobNoTasks

Error Message:
expected:<SUCCEEDED> but was:<FAILED>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<FAILED>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testJobNoTasks(TestJobImpl.java:168)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testRebootedDuringCommit

Error Message:
test timed out after 20000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 20000 milliseconds
	at sun.misc.Unsafe.park(Native Method)
	at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2043)
	at java.util.concurrent.CyclicBarrier.dowait(CyclicBarrier.java:227)
	at java.util.concurrent.CyclicBarrier.await(CyclicBarrier.java:355)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testRebootedDuringCommit(TestJobImpl.java:302)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testCommitJobFailsJob

Error Message:
test timed out after 20000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 20000 milliseconds
	at sun.misc.Unsafe.park(Native Method)
	at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2043)
	at java.util.concurrent.CyclicBarrier.dowait(CyclicBarrier.java:227)
	at java.util.concurrent.CyclicBarrier.await(CyclicBarrier.java:355)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testCommitJobFailsJob(TestJobImpl.java:197)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
test timed out after 20000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 20000 milliseconds
	at sun.misc.Unsafe.park(Native Method)
	at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2043)
	at java.util.concurrent.CyclicBarrier.dowait(CyclicBarrier.java:227)
	at java.util.concurrent.CyclicBarrier.await(CyclicBarrier.java:355)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:628)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testKilledDuringCommit

Error Message:
test timed out after 20000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 20000 milliseconds
	at sun.misc.Unsafe.park(Native Method)
	at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2043)
	at java.util.concurrent.CyclicBarrier.dowait(CyclicBarrier.java:227)
	at java.util.concurrent.CyclicBarrier.await(CyclicBarrier.java:355)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testKilledDuringCommit(TestJobImpl.java:369)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testCommandLine

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testCommandLine(TestMapReduceChildJVM.java:54)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLineWithSeparateShuffle

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine(TestMapReduceChildJVM.java:110)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLineWithSeparateShuffle(TestMapReduceChildJVM.java:87)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testCommandLineWithLog4JConifg

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testCommandLineWithLog4JConifg(TestMapReduceChildJVM.java:159)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine(TestMapReduceChildJVM.java:110)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine(TestMapReduceChildJVM.java:102)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testAutoHeapSizes

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testAutoHeapSize(TestMapReduceChildJVM.java:228)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testAutoHeapSizes(TestMapReduceChildJVM.java:183)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testEnvironmentVariables

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testEnvironmentVariables(TestMapReduceChildJVM.java:286)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLineWithSeparateCRLAShuffle

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine(TestMapReduceChildJVM.java:110)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLineWithSeparateCRLAShuffle(TestMapReduceChildJVM.java:96)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestShuffleProvider.testShuffleProviders

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestShuffleProvider.testShuffleProviders(TestShuffleProvider.java:106)
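
The ConnectException failures in TestShuffleProvider and most of the TestTaskAttempt cases follow a second pattern: TaskAttemptImpl.createLocalResource calls FileSystem.getFileStatus while building the container launch context, and that getFileInfo RPC hits the same unreachable NameNode. A minimal sketch of that lookup is below; the address and the job.jar path are assumptions for illustration.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class LocalResourceLookupSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Illustrative address; matches the host:port reported in the failures above.
    conf.set("fs.defaultFS", "hdfs://localhost:37852");

    Path jobJar = new Path("/tmp/hadoop-yarn/staging/job.jar"); // assumed path
    FileSystem fs = jobJar.getFileSystem(conf);
    // createLocalResource needs the file's length and modification time, so it asks
    // the NameNode via getFileStatus; with nothing listening on the RPC port this is
    // where java.net.ConnectException ("Connection refused") surfaces.
    FileStatus status = fs.getFileStatus(jobJar);
    System.out.println(status.getLen() + " " + status.getModificationTime());
  }
}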


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testContainerKillAfterAssigned

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testContainerKillAfterAssigned(TestTaskAttempt.java:1029)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedFailHistory(TestTaskAttempt.java:381)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:132)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForReduce

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistory(TestTaskAttempt.java:355)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForReduce(TestTaskAttempt.java:123)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testContainerKillWhileCommitPending

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testContainerKillWhileCommitPending(TestTaskAttempt.java:1140)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testKillMapTaskWhileSuccessFinishing

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.createTaskAttemptImpl(TestTaskAttempt.java:1401)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testKillMapTaskWhileSuccessFinishing(TestTaskAttempt.java:1162)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testKillMapTaskWhileFailFinishing

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.createTaskAttemptImpl(TestTaskAttempt.java:1401)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testKillMapTaskWhileFailFinishing(TestTaskAttempt.java:1201)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testContainerCleanedWhileRunning

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testContainerCleanedWhileRunning(TestTaskAttempt.java:594)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTooManyFetchFailureAfterKill

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTooManyFetchFailureAfterKill(TestTaskAttempt.java:843)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testContainerKillWhileRunning

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testContainerKillWhileRunning(TestTaskAttempt.java:1083)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTimeoutWhileSuccessFinishing

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.createTaskAttemptImpl(TestTaskAttempt.java:1401)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTimeoutWhileSuccessFinishing(TestTaskAttempt.java:1301)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testFailMapTaskByClient

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.createTaskAttemptImpl(TestTaskAttempt.java:1401)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testFailMapTaskByClient(TestTaskAttempt.java:1246)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTimeoutWhileFailFinishing

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.createTaskAttemptImpl(TestTaskAttempt.java:1401)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTimeoutWhileFailFinishing(TestTaskAttempt.java:1328)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptDiagnosticEventOnFinishing

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.createTaskAttemptImpl(TestTaskAttempt.java:1401)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptDiagnosticEventOnFinishing(TestTaskAttempt.java:1275)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMillisCountersUpdate

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.verifyMillisCounters(TestTaskAttempt.java:280)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMillisCountersUpdate(TestTaskAttempt.java:262)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testFetchFailureAttemptFinishTime

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testFetchFailureAttemptFinishTime(TestTaskAttempt.java:959)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testContainerCleanedWhileCommitting

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testContainerCleanedWhileCommitting(TestTaskAttempt.java:652)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testDoubleTooManyFetchFailure

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testDoubleTooManyFetchFailure(TestTaskAttempt.java:716)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForMap

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistory(TestTaskAttempt.java:355)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForMap(TestTaskAttempt.java:117)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testLaunchFailedWhileKilling

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1691)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl$ContainerAssignedTransition.transition(TaskAttemptImpl.java:1668)
	at org.apache.hadoop.yarn.state.StateMachineFactory$SingleInternalArc.doTransition(StateMachineFactory.java:362)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.handle(TaskAttemptImpl.java:1190)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testLaunchFailedWhileKilling(TestTaskAttempt.java:536)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttemptContainerRequest.testAttemptContainerRequest

Error Message:
java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.ConnectException: Call From asf904.gq1.ygridcore.net/67.195.81.148 to localhost:37852 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:751)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1630)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1324)
	at org.apache.hadoop.hdfs.DistributedFileSystem$26.doCall(DistributedFileSystem.java:1321)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1321)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createLocalResource(TaskAttemptImpl.java:713)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createCommonContainerLaunchContext(TaskAttemptImpl.java:799)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl.createContainerLaunchContext(TaskAttemptImpl.java:927)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttemptContainerRequest.testAttemptContainerRequest(TestTaskAttemptContainerRequest.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.launcher.TestContainerLauncher.testSlowNM

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy18.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy19.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.launcher.TestContainerLauncher.testSlowNM(TestContainerLauncher.java:274)


FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testReportedAppProgress

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy19.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy20.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testReportedAppProgress(TestRMContainerAllocator.java:935)


FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testReportedAppProgressWithOnlyMaps

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy19.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy20.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testReportedAppProgressWithOnlyMaps(TestRMContainerAllocator.java:1087)


FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testUnregistrationOnlyIfRegistered

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy19.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy20.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testUnregistrationOnlyIfRegistered(TestRMContainerAllocator.java:2404)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp.testMRWebAppSSLDisabled

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy22.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy23.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp.testMRWebAppSSLDisabled(TestAMWebApp.java:184)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp.testMRWebAppRedirection

Error Message:
Error creating staging dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Error creating staging dir
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:682)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:778)
	at org.apache.hadoop.ipc.Client$Connection.access$3100(Client.java:413)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1503)
	at org.apache.hadoop.ipc.Client.call(Client.java:1373)
	at org.apache.hadoop.ipc.Client.call(Client.java:1351)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy22.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy23.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2302)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2277)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.serviceInit(MRApp.java:267)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:300)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp.testMRWebAppRedirection(TestAMWebApp.java:241)
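
The "Error creating staging dir" failures above are the other recurring shape in this run: MRApp.serviceInit() calls FileSystem.mkdirs() for the job staging directory, and the IOException is rethrown as a YarnRuntimeException, so every MRApp-based test fails at submit() time. The SocketChannelImpl.checkConnect frames at the top of each of these traces suggest the same unreachable NameNode as the ConnectException cases, though the address is not printed here. A hedged sketch of the pattern (illustrative names, not the MRApp source):

    // Sketch only: the "Error creating staging dir" shape. A connection failure
    // during FileSystem.mkdirs() is rethrown as YarnRuntimeException, so the
    // whole group of tests fails for one root cause before any test body runs.
    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.yarn.exceptions.YarnRuntimeException;

    class StagingDirSketch {
      static void ensureStagingDir(Configuration conf, Path stagingDir) {
        try {
          FileSystem fs = FileSystem.get(conf);
          fs.mkdirs(stagingDir);   // mkdirs RPC to the NameNode; refused here
        } catch (IOException e) {
          throw new YarnRuntimeException("Error creating staging dir", e);
        }
      }
    }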



Hadoop-Mapreduce-trunk - Build # 3191 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3191/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32010 lines...]
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.857 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.897 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.533 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestTextInputFormat.testSplitableCodecs:223 Key in multiple partitions.
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.899 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:55 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.805 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.883 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:29 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:41 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:51 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-04-13T03:42:04+00:00
[INFO] Final Memory: 34M/611M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestTextInputFormat.testSplitableCodecs

Error Message:
Key in multiple partitions.

Stack Trace:
java.lang.AssertionError: Key in multiple partitions.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertFalse(Assert.java:64)
	at org.apache.hadoop.mapred.TestTextInputFormat.testSplitableCodecs(TestTextInputFormat.java:223)
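
The "Key in multiple partitions." message together with the fail -> assertTrue -> assertFalse frames indicates a message-carrying assertFalse in the splittable-codec test: a key read from one split must not already have been seen in another. A hypothetical reconstruction of that assertion shape (names are guesses, not the TestTextInputFormat source):

    // Hypothetical reconstruction of the assertion implied by the trace above.
    import static org.junit.Assert.assertFalse;
    import java.util.HashSet;
    import java.util.Set;
    import org.junit.Test;

    public class SplitKeySketch {
      @Test
      public void keysAppearInExactlyOneSplit() {
        Set<String> seen = new HashSet<>();
        // Stand-in for the keys read back from each input split.
        for (String key : new String[] {"a", "b", "c"}) {
          assertFalse("Key in multiple partitions.", seen.contains(key));
          seen.add(key);
        }
      }
    }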


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
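
The repeated "Error Message: null" in these TestUberAM failures is a property of JUnit 4's bare assertTrue(boolean): it delegates to assertTrue(null, condition) and fails with a null message, which matches the Assert.java:52 -> 41 -> 86 frames. A minimal illustration (the counter names are made up, not taken from TestMRJobs.verifySleepJobCounters):

    import static org.junit.Assert.assertTrue;
    import org.junit.Test;

    public class NullMessageSketch {
      @Test
      public void bareAssertReportsNullMessage() {
        long observedCounter = 0;   // stand-in for a job counter value
        long expectedMinimum = 1;
        // Fails as "java.lang.AssertionError: null", like the traces above.
        assertTrue(observedCounter >= expectedMinimum);
        // A message-carrying variant would name the counter instead:
        // assertTrue("sleep job counter too small", observedCounter >= expectedMinimum);
      }
    }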



Hadoop-Mapreduce-trunk - Build # 3190 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3190/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32089 lines...]
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.585 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests in error: 
  TestMapReduceLazyOutput.testLazyOutput:186 » NoClassDefFound org/apache/hadoop...
  TestMRCredentials.setUp:62 » NoClassDefFound org/apache/hadoop/hdfs/server/nam...
  TestEncryptedShuffle.encryptedShuffleWithClientCerts:167->encryptedShuffleWithCerts:138->startCluster:107 » NoClassDefFound
  TestEncryptedShuffle.encryptedShuffleWithoutClientCerts:172->encryptedShuffleWithCerts:138->startCluster:107 » NoClassDefFound

Tests run: 533, Failures: 4, Errors: 4, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.905 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:53 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.774 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.267 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:25 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:36 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-13T00:43:50+00:00
[INFO] Final Memory: 34M/748M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
8 tests failed.
FAILED:  org.apache.hadoop.mapreduce.TestMapReduceLazyOutput.testLazyOutput

Error Message:
org/apache/hadoop/util/ShutdownThreadsHelper

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/ShutdownThreadsHelper
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceStop(HistoryFileManager.java:668)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceStop(JobHistory.java:172)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceStop(JobHistoryServer.java:208)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStop(MiniMRYarnCluster.java:257)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.stop(MiniMRYarnClusterAdapter.java:55)
	at org.apache.hadoop.mapred.MiniMRCluster.shutdown(MiniMRCluster.java:267)
	at org.apache.hadoop.mapreduce.TestMapReduceLazyOutput.testLazyOutput(TestMapReduceLazyOutput.java:186)


FAILED:  org.apache.hadoop.mapreduce.security.TestMRCredentials.org.apache.hadoop.mapreduce.security.TestMRCredentials

Error Message:
org/apache/hadoop/hdfs/server/namenode/JournalSet$5

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/server/namenode/JournalSet$5
	at org.apache.hadoop.hdfs.server.namenode.JournalSet.close(JournalSet.java:243)
	at org.apache.hadoop.hdfs.server.namenode.FSEditLog.close(FSEditLog.java:375)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.stopActiveServices(FSNamesystem.java:1202)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.close(FSNamesystem.java:1558)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.stopCommonServices(NameNode.java:790)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:969)
	at org.apache.hadoop.hdfs.MiniDFSCluster.stopAndJoinNameNode(MiniDFSCluster.java:1965)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1911)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1882)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1875)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:854)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.mapreduce.security.TestMRCredentials.setUp(TestMRCredentials.java:62)


FAILED:  org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.encryptedShuffleWithClientCerts

Error Message:
org/apache/hadoop/yarn/server/MiniYARNCluster

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/server/MiniYARNCluster
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
	at org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.startCluster(TestEncryptedShuffle.java:107)
	at org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.encryptedShuffleWithCerts(TestEncryptedShuffle.java:138)
	at org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.encryptedShuffleWithClientCerts(TestEncryptedShuffle.java:167)


FAILED:  org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.encryptedShuffleWithoutClientCerts

Error Message:
org/apache/hadoop/mapreduce/v2/MiniMRYarnCluster

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/MiniMRYarnCluster
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
	at org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.startCluster(TestEncryptedShuffle.java:107)
	at org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.encryptedShuffleWithCerts(TestEncryptedShuffle.java:138)
	at org.apache.hadoop.mapreduce.security.ssl.TestEncryptedShuffle.encryptedShuffleWithoutClientCerts(TestEncryptedShuffle.java:172)
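
The three NoClassDefFoundError failures above (JournalSet$5, MiniYARNCluster, MiniMRYarnCluster) typically mean the surefire JVM's classpath is missing, or stale relative to, the freshly built hdfs/yarn test jars, rather than that the tests themselves regressed. A minimal, self-contained probe (the class name is the one quoted in the report; everything else is illustrative, not part of the test code) for checking whether such a class is resolvable at all from inside a JVM:

    public class ClasspathProbe {
        public static void main(String[] args) {
            // Class named in the surefire report; any fully qualified name works.
            String name = "org.apache.hadoop.yarn.server.MiniYARNCluster";
            try {
                // Load without initializing, using the probe's own class loader.
                Class.forName(name, false, ClasspathProbe.class.getClassLoader());
                System.out.println(name + " is resolvable on this classpath");
            } catch (ClassNotFoundException e) {
                System.out.println(name + " is NOT on this classpath: " + e);
            }
        }
    }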


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
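
All four TestUberAM entries above carry an error message of "null" because the checks that fail inside TestMRJobs.verifySleepJobCounters / verifyRandomWriterCounters are message-less assertTrue calls, and JUnit only reports whatever message the assertion was given. A small standalone illustration (the counter name and value are made up, not taken from the test):

    import static org.junit.Assert.assertTrue;

    public class AssertMessageDemo {
        public static void main(String[] args) {
            long uberSubMaps = 3;   // stand-in for a job counter value

            // With no message, a failure surfaces exactly as in this report:
            //   java.lang.AssertionError: null
            assertTrue(uberSubMaps > 0);

            // With a message, the surefire report says what was being checked:
            assertTrue("expected at least one uber sub-map, got " + uberSubMaps,
                    uberSubMaps > 0);
        }
    }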



Hadoop-Mapreduce-trunk - Build # 3189 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3189/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32113 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.219 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.473 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.33 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Tests in error: 
  TestMRTimelineEventHandling.testMapreduceJobTimelineServiceEnabled:172 » IO Jo...
  TestMiniMRProxyUser.setUp:88 » YarnRuntime java.lang.NoClassDefFoundError: org...
  TestJobOutputCommitter.setUp:64->HadoopTestCase.setUp:156 » YarnRuntime org.ap...
  TestJobOutputCommitter.tearDown:71 NullPointer
  TestJobOutputCommitter.setUp:64->HadoopTestCase.setUp:156 » YarnRuntime org.ap...
  TestJobOutputCommitter.tearDown:71 NullPointer
  TestJobOutputCommitter.setUp:64->HadoopTestCase.setUp:156 » YarnRuntime org.ap...
  TestJobOutputCommitter.tearDown:71 NullPointer

Tests run: 516, Failures: 0, Errors: 7, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.394 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:14 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 47.679 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  9.820 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [13:05 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [07:43 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:28 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:53 h
[INFO] Finished at: 2016-04-12T22:10:53+00:00
[INFO] Final Memory: 41M/1017M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.7.0_55/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter434301716189836819.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire1981900445191391168tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2681726835145962152621tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRTimelineEventHandling.testMapreduceJobTimelineServiceEnabled

Error Message:
Job didn't finish in 30 seconds

Stack Trace:
java.io.IOException: Job didn't finish in 30 seconds
	at org.apache.hadoop.mapred.UtilsForTests.runJobSucceed(UtilsForTests.java:622)
	at org.apache.hadoop.mapred.TestMRTimelineEventHandling.testMapreduceJobTimelineServiceEnabled(TestMRTimelineEventHandling.java:172)
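
The timeout above is thrown by the test utility when the submitted job has not completed within its fixed window. A rough sketch of that bounded-wait pattern, assuming an org.apache.hadoop.mapreduce.Job handle (the 30-second figure is the one quoted in the message; the helper below is illustrative, not the actual UtilsForTests code):

    import java.io.IOException;
    import org.apache.hadoop.mapreduce.Job;

    public class JobWait {
        // Poll the job until it completes or the deadline passes.
        static void waitForJob(Job job, long timeoutMs)
                throws IOException, InterruptedException {
            long deadline = System.currentTimeMillis() + timeoutMs;
            while (!job.isComplete()) {
                if (System.currentTimeMillis() > deadline) {
                    throw new IOException(
                        "Job didn't finish in " + (timeoutMs / 1000) + " seconds");
                }
                Thread.sleep(1000);
            }
        }
    }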


FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testCustomCleanup

Error Message:
org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:870)
	at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:348)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1078)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1176)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:335)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$300(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:464)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:156)
	at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.setUp(TestJobOutputCommitter.java:64)


FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testCustomCleanup

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
	at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.tearDown(TestJobOutputCommitter.java:71)


FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testDefaultCleanupAndAbort

Error Message:
org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:870)
	at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:348)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1078)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1176)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:335)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$300(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:464)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:156)
	at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.setUp(TestJobOutputCommitter.java:64)


FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testDefaultCleanupAndAbort

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
	at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.tearDown(TestJobOutputCommitter.java:71)


FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testCustomAbort

Error Message:
org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:870)
	at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:348)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1078)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1176)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:335)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$300(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:464)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:156)
	at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.setUp(TestJobOutputCommitter.java:64)


FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testCustomAbort

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
	at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.tearDown(TestJobOutputCommitter.java:71)
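
Each of the three NullPointerExceptions in tearDown above is secondary damage: setUp died while starting the ResourceManager web app, so the cluster fields it would normally populate are still null by the time tearDown runs. A hypothetical guard (class, field and method names are assumed for illustration, not HadoopTestCase's actual members) that keeps a failed setUp from cascading into a second failure:

    import junit.framework.TestCase;
    import org.apache.hadoop.mapred.MiniMRCluster;

    public abstract class GuardedClusterTestCase extends TestCase {
        protected MiniMRCluster mrCluster;   // assumed to be created in setUp

        @Override
        protected void tearDown() throws Exception {
            // Only shut down what setUp actually managed to start.
            if (mrCluster != null) {
                mrCluster.shutdown();
            }
            super.tearDown();
        }
    }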



Hadoop-Mapreduce-trunk - Build # 3188 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3188/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32010 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.044 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.32 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.565 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.87 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.793 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:48 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.335 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.129 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:15 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:34 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2016-04-12T16:23:15+00:00
[INFO] Final Memory: 34M/607M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
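
The assertion text above ("Output directory not empty expected:<0> but was:<4>") is the usual shape of an assertEquals on the number of files left behind after an aborted task. A hedged sketch of that kind of check (the filesystem handle and path are illustrative, not the test's actual code):

    import static org.junit.Assert.assertEquals;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class OutputDirCheck {
        // Count what survived the abort and expect nothing to be there.
        static void assertOutputDirEmpty(Path outDir) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            FileStatus[] leftovers = fs.listStatus(outDir);
            assertEquals("Output directory not empty", 0, leftovers.length);
        }
    }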


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3187 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3187/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31180 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.267 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Running org.apache.hadoop.mapred.TestMapFileOutputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.383 sec - in org.apache.hadoop.mapred.TestMapFileOutputFormat
Running org.apache.hadoop.mapred.TestJobAclsManager
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.294 sec - in org.apache.hadoop.mapred.TestJobAclsManager
Running org.apache.hadoop.mapred.TestLineRecordReader
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.161 sec - in org.apache.hadoop.mapred.TestLineRecordReader
Running org.apache.hadoop.mapred.TestClock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.088 sec - in org.apache.hadoop.mapred.TestClock
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.095 sec - in org.apache.hadoop.mapred.TestJobQueueClient

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.801 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:53 min
[INFO] Finished at: 2016-04-12T09:07:34+00:00
[INFO] Final Memory: 34M/899M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk - Build # 3186 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3186/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32001 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.838 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.695 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.642 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.117 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.833 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:56 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.408 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.245 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:26 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:41 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:49 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-04-12T08:39:09+00:00
[INFO] Final Memory: 34M/724M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3185 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3185/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 33223 lines...]
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobs.<init>:116->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer

Tests run: 340, Failures: 0, Errors: 77, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.693 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:47 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.649 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.965 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [08:35 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11:00 min
[INFO] Finished at: 2016-04-11T20:20:58+00:00
[INFO] Final Memory: 34M/671M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
77 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidUri2

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)
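
The long run of TestContainerException/BindException failures in this build is one root cause repeated per test method: the Grizzly test container tries to bind a port that something else on the build slave already holds. A small, generic sketch (nothing Jersey-specific; purely illustrative) of how a test can ask the OS for a free ephemeral port instead of assuming a fixed one:

    import java.io.IOException;
    import java.net.ServerSocket;

    public class FreePortFinder {
        // Bind port 0 so the kernel picks an unused ephemeral port, then release
        // it and hand the number to whatever server the test is about to start.
        static int findFreePort() throws IOException {
            try (ServerSocket probe = new ServerSocket(0)) {
                return probe.getLocalPort();
            }
        }

        public static void main(String[] args) throws IOException {
            System.out.println("free port: " + findFreePort());
        }
    }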


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfo

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAM

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidAccept

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testBlacklistedNodesXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testBlacklistedNodes

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidUri

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testGetTaskAttemptIdXMLState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testPutTaskAttemptIdState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testGetTaskAttemptIdState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testPutTaskAttemptIdXMLState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptsXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdInvalid2

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdInvalid3

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptId

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdNonExist

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptsDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttempts

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdXMLCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptsSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdBogus

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConf

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCountersXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttemptsXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobsXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobsDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobsSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobId

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobs

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalidXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalidBogus

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttemptsSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCountersSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttempts

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalidDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCountersDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdNonExist

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttemptsDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)
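
The TestAMWebServicesTasks failures below are the same bind conflict again. At the plain socket level, the usual way to make an embedded test server immune to this kind of collision is to bind to port 0 so the operating system assigns a free ephemeral port; whether the JerseyTest/Grizzly web test container used by these tests exposes such an option is not visible from this log, so the sketch below only shows the underlying mechanism rather than a fix for these specific tests.

import java.net.InetSocketAddress;
import java.net.ServerSocket;

// Binding to port 0 asks the OS for any free ephemeral port, which
// cannot collide with a port that is already in use.
public class EphemeralPortDemo {
    public static void main(String[] args) throws Exception {
        try (ServerSocket socket = new ServerSocket()) {
            socket.bind(new InetSocketAddress("127.0.0.1", 0));
            System.out.println("OS-assigned free port: " + socket.getLocalPort());
        }
    }
}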


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdBogus

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testJobTaskCountersXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryReduce

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasks

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryMap

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid2

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid3

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCountersSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdNonExist

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCountersDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskId

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)
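
All of the TestAMWebServicesTasks failures above share one symptom: the embedded Jersey/Grizzly test container cannot bind its HTTP listener because the port it wants is still held by another process or by an earlier test run on the same Jenkins slave, so every test constructor fails with java.net.BindException: Address already in use. A minimal sketch of the usual remedy, letting the OS hand out an unused ephemeral port before the server starts, follows; FreePortExample and findFreePort are illustrative names only and are not part of the Hadoop test code.

    // Illustrative only, not taken from the Hadoop sources: bind a
    // ServerSocket to port 0 so the kernel assigns an unused port, then
    // release it and pass the number to the embedded test HTTP server.
    import java.io.IOException;
    import java.net.ServerSocket;

    public class FreePortExample {
        static int findFreePort() throws IOException {
            try (ServerSocket socket = new ServerSocket(0)) {
                socket.setReuseAddress(true);
                return socket.getLocalPort();
            }
        }

        public static void main(String[] args) throws IOException {
            // The returned port is free at the moment of the call; a test
            // container started immediately afterwards can normally bind it.
            System.out.println("free port: " + findFreePort());
        }
    }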



Hadoop-Mapreduce-trunk - Build # 3184 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3184/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32001 lines...]
Running org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 69.309 sec - in org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService
Running org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 61.786 sec - in org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Running org.apache.hadoop.mapreduce.TestValueIterReset
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.923 sec - in org.apache.hadoop.mapreduce.TestValueIterReset
Running org.apache.hadoop.mapreduce.TestMapReduceLazyOutput
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 160.842 sec - in org.apache.hadoop.mapreduce.TestMapReduceLazyOutput

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.097 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:19 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 34.074 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.867 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:48 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:00 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:51 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:11 h
[INFO] Finished at: 2016-04-11T19:24:07+00:00
[INFO] Final Memory: 35M/1018M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
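
The "Error Message: null" on each TestUberAM failure reflects how the check is written rather than a lost message: the stack traces end in Assert.assertTrue(Assert.java:52) -> assertTrue(Assert.java:41) -> fail(Assert.java:86), the JUnit 4 path taken when assertTrue(boolean) is called without a message argument, so the resulting AssertionError carries a null message. A hedged sketch reproducing the pattern follows; NullMessageAssertionExample and launchedMaps are illustrative stand-ins, not the actual verifySleepJobCounters code.

    // Illustrative only: an assertTrue call with no message string fails with
    // "java.lang.AssertionError: null", which surefire then reports as
    // "Error Message: null", exactly as in the failures above.
    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class NullMessageAssertionExample {
        @Test
        public void counterCheckWithoutMessage() {
            long launchedMaps = 0; // stand-in for a counter read from the finished job
            assertTrue(launchedMaps > 0); // fails, and the error message is null
        }
    }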



Hadoop-Mapreduce-trunk - Build # 3183 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3183/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32001 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.074 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.121 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.302 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.126 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.886 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:54 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.393 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.144 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:22 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:41 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-04-11T09:30:50+00:00
[INFO] Final Memory: 33M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3182 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3182/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32002 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.161 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.597 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.968 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.129 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.743 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:54 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.672 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.142 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:21 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:40 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-11T06:58:40+00:00
[INFO] Final Memory: 33M/603M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3181 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3181/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 3780 lines...]
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.205 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 73.725 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.675 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.093 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.170 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:01 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.069 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.412 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:23 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:40 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:45 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-10T19:37:57+00:00
[INFO] Final Memory: 39M/689M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk - Build # 3180 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3180/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 3817 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.216 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.541 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.792 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 7, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.855 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:59 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.064 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.422 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:22 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:38 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:49 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-04-09T22:00:30+00:00
[INFO] Final Memory: 39M/689M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
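
The testAbort failure above is an assertEquals on the number of entries left in the job output directory after an abort: the committer is expected to remove everything, but four entries survived. A minimal, hypothetical sketch of a check of this shape (not the actual TestMRCJCFileOutputCommitter code), using only the FileSystem API:

    import static org.junit.Assert.assertEquals;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.junit.Test;

    public class AbortCleanupSketch {
      @Test
      public void outputDirIsEmptyAfterAbort() throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.getLocal(conf);
        // Hypothetical output directory; the real test uses its own temp path.
        Path outDir = new Path("/tmp/test-abort-output");
        // ... run the job and abort it here ...
        int remaining = fs.listStatus(outDir).length;
        // Produces "Output directory not empty expected:<0> but was:<4>" when
        // four files survive the abort, matching the listing above.
        assertEquals("Output directory not empty", 0, remaining);
      }
    }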


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
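
The "expected:<DEFAULT> but was:<HIGH>" form, in turn, comes from the two-argument Assert.assertEquals(expected, actual): with no message supplied, JUnit prints just the two values in angle brackets. Another minimal, hypothetical sketch (the Priority enum here is an illustrative stand-in, not the MapReduce JobPriority type):

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class PrioritySketch {
      // Illustrative stand-in for the job priority enum used by the real test.
      enum Priority { DEFAULT, HIGH }

      @Test
      public void expectedVersusWas() {
        Priority expected = Priority.DEFAULT;
        Priority actual = Priority.HIGH;  // what the job reported after the priority change
        // Fails with "java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>",
        // matching the listing above.
        assertEquals(expected, actual);
      }
    }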


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3179 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3179/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 3812 lines...]
Running org.apache.hadoop.mapred.TestMiniMRDFSCaching
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.044 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.249 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.532 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.759 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 529, Failures: 7, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.486 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:04 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 33.758 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.676 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:27 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:44 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:56 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:14 h
[INFO] Finished at: 2016-04-09T19:48:44+00:00
[INFO] Final Memory: 41M/1028M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3178 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3178/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8217 lines...]
+-org.apache.hadoop:hadoop-mapreduce-client-common:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-common:3.0.0-20160409.003029-5493
    +-org.apache.hadoop:hadoop-yarn-common:3.0.0-SNAPSHOT

[WARNING] Rule 0: org.apache.maven.plugins.enforcer.DependencyConvergence failed with message:
Failed while enforcing releasability the error(s) are [
Dependency convergence error for org.apache.hadoop:hadoop-yarn-common:3.0.0-20160409.003004-5499 paths to dependency are:
+-org.apache.hadoop:hadoop-mapreduce-client-common:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-common:3.0.0-20160409.003004-5499
and
+-org.apache.hadoop:hadoop-mapreduce-client-common:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-client:3.0.0-20160409.003039-3785
    +-org.apache.hadoop:hadoop-yarn-common:3.0.0-SNAPSHOT
and
+-org.apache.hadoop:hadoop-mapreduce-client-common:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-mapreduce-client-core:3.0.0-SNAPSHOT
    +-org.apache.hadoop:hadoop-yarn-common:3.0.0-SNAPSHOT
and
+-org.apache.hadoop:hadoop-mapreduce-client-common:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-common:3.0.0-20160409.003029-5493
    +-org.apache.hadoop:hadoop-yarn-common:3.0.0-SNAPSHOT
]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  9.768 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:32 min]
[INFO] Apache Hadoop MapReduce Common .................... FAILURE [ 40.846 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:25 min
[INFO] Finished at: 2016-04-09T10:38:54+00:00
[INFO] Final Memory: 47M/884M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (depcheck) on project hadoop-mapreduce-client-common: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-common
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed

Hadoop-Mapreduce-trunk - Build # 3177 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3177/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 3817 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.356 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.478 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.43 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 7, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.835 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:00 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.324 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.598 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:18 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:36 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-09T02:45:38+00:00
[INFO] Final Memory: 39M/724M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3176 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3176/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 3807 lines...]
Running org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 62.809 sec - in org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Running org.apache.hadoop.mapreduce.TestValueIterReset
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.868 sec - in org.apache.hadoop.mapreduce.TestValueIterReset
Running org.apache.hadoop.mapreduce.TestMapReduceLazyOutput
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 157.059 sec - in org.apache.hadoop.mapreduce.TestMapReduceLazyOutput

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 6, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.643 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:21 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 35.847 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  8.200 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:22 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:59 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:51 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:11 h
[INFO] Finished at: 2016-04-08T23:52:34+00:00
[INFO] Final Memory: 39M/965M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3175 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3175/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8150 lines...]
+-org.apache.hadoop:hadoop-mapreduce-client-common:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-common:3.0.0-20160408.161435-5492
    +-org.apache.hadoop:hadoop-yarn-common:3.0.0-SNAPSHOT

[WARNING] Rule 0: org.apache.maven.plugins.enforcer.DependencyConvergence failed with message:
Failed while enforcing releasability the error(s) are [
Dependency convergence error for org.apache.hadoop:hadoop-yarn-common:3.0.0-20160408.161432-5498 paths to dependency are:
+-org.apache.hadoop:hadoop-mapreduce-client-common:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-common:3.0.0-20160408.161432-5498
and
+-org.apache.hadoop:hadoop-mapreduce-client-common:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-client:3.0.0-SNAPSHOT
    +-org.apache.hadoop:hadoop-yarn-common:3.0.0-SNAPSHOT
and
+-org.apache.hadoop:hadoop-mapreduce-client-common:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-mapreduce-client-core:3.0.0-SNAPSHOT
    +-org.apache.hadoop:hadoop-yarn-common:3.0.0-SNAPSHOT
and
+-org.apache.hadoop:hadoop-mapreduce-client-common:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-common:3.0.0-20160408.161435-5492
    +-org.apache.hadoop:hadoop-yarn-common:3.0.0-SNAPSHOT
]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  7.435 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:29 min]
[INFO] Apache Hadoop MapReduce Common .................... FAILURE [ 38.078 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:16 min
[INFO] Finished at: 2016-04-08T18:39:24+00:00
[INFO] Final Memory: 59M/1302M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (depcheck) on project hadoop-mapreduce-client-common: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-common
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed

Hadoop-Mapreduce-trunk - Build # 3174 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3174/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 6578 lines...]
+-org.apache.hadoop:hadoop-mapreduce-client-app:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-resourcemanager:3.0.0-20160408.161445-5167
and
+-org.apache.hadoop:hadoop-mapreduce-client-app:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-resourcemanager:3.0.0-20160408.161445-5167
and
+-org.apache.hadoop:hadoop-mapreduce-client-app:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-tests:3.0.0-SNAPSHOT
    +-org.apache.hadoop:hadoop-yarn-server-resourcemanager:3.0.0-SNAPSHOT
, 
Dependency convergence error for org.apache.hadoop:hadoop-yarn-server-nodemanager:3.0.0-20160408.161437-5179 paths to dependency are:
+-org.apache.hadoop:hadoop-mapreduce-client-app:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-nodemanager:3.0.0-20160408.161437-5179
and
+-org.apache.hadoop:hadoop-mapreduce-client-app:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-mapreduce-client-shuffle:3.0.0-SNAPSHOT
    +-org.apache.hadoop:hadoop-yarn-server-nodemanager:3.0.0-SNAPSHOT
and
+-org.apache.hadoop:hadoop-mapreduce-client-app:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-tests:3.0.0-SNAPSHOT
    +-org.apache.hadoop:hadoop-yarn-server-nodemanager:3.0.0-SNAPSHOT
]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.794 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:04 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.450 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.400 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [09:19 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:06 min
[INFO] Finished at: 2016-04-08T16:47:00+00:00
[INFO] Final Memory: 58M/741M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (depcheck) on project hadoop-mapreduce-client-app: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed

Hadoop-Mapreduce-trunk - Build # 3173 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3173/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2931 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.805 sec - in org.apache.hadoop.mapred.TestTaskLog
Running org.apache.hadoop.mapred.TestTaskLogAppender
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.231 sec - in org.apache.hadoop.mapred.TestTaskLogAppender
Running org.apache.hadoop.mapred.TestJobInfo
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.699 sec - in org.apache.hadoop.mapred.TestJobInfo
Running org.apache.hadoop.mapred.TestJobAclsManager
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.446 sec - in org.apache.hadoop.mapred.TestJobAclsManager
Running org.apache.hadoop.mapred.TestSkipBadRecords
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.986 sec - in org.apache.hadoop.mapred.TestSkipBadRecords
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.094 sec - in org.apache.hadoop.mapred.TestJobQueueClient

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.796 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [02:02 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 min
[INFO] Finished at: 2016-04-08T12:37:11+00:00
[INFO] Final Memory: 42M/898M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk - Build # 3172 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3172/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32028 lines...]
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.821 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.177 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.333 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 6, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.472 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:05 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 31.142 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.527 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:49 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:54 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2016-04-08T10:13:15+00:00
[INFO] Final Memory: 35M/748M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3171 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3171/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32028 lines...]
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.705 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.873 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.089 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 6, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.306 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:13 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 32.823 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.609 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:43 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:53 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:52 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:11 h
[INFO] Finished at: 2016-04-08T03:11:09+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3170 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3170/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32039 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.272 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.566 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.872 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 7, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.983 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:53 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.965 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.911 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:45 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:46 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:02 h
[INFO] Finished at: 2016-04-07T10:01:21+00:00
[INFO] Final Memory: 34M/713M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
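
[Editor's note] The "Output directory not empty expected:<0> but was:<4>" message corresponds to an assertEquals on the number of files left in the job's output directory after the abort. A minimal, hedged sketch of that kind of check follows; the directory path and setup are assumptions, not the actual test at TestMRCJCFileOutputCommitter.java:153.

    // Hypothetical sketch of an "output directory must be empty after abort"
    // check; not the Hadoop test source, only the shape of the assertion.
    import static org.junit.Assert.assertEquals;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.junit.Test;

    public class AbortLeavesNoOutputSketch {
      @Test
      public void outputDirIsEmptyAfterAbort() throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.getLocal(conf);
        Path outDir = new Path("target/sketch-output");   // assumed location
        fs.mkdirs(outDir);
        // ... run a task, write output, then abort the job/task here ...
        // The failing report shows "expected:<0> but was:<4>": four files
        // (e.g. part files or temporary attempt dirs) survived the abort.
        assertEquals("Output directory not empty", 0, fs.listStatus(outDir).length);
      }
    }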


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3169 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3169/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32037 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.231 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.556 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.177 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:551 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:474 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 7, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.865 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.689 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.113 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:18 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:42 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2016-04-07T07:10:46+00:00
[INFO] Final Memory: 33M/607M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:551)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:474)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3168 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3168/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31462 lines...]
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.199 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices
Running org.apache.hadoop.mapreduce.v2.app.TestTaskHeartbeatHandler
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.435 sec - in org.apache.hadoop.mapreduce.v2.app.TestTaskHeartbeatHandler
Running org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.618 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.619 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.881 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.402 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestKill.testKillJob:84 Task state not correct expected:<KILLED> but was:<NEW>

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.513 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:14 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 44.518 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [ 10.429 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [13:17 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17:34 min
[INFO] Finished at: 2016-04-07T03:50:53+00:00
[INFO] Final Memory: 36M/835M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Task state not correct expected:<KILLED> but was:<NEW>

Stack Trace:
java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<NEW>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:84)



Hadoop-Mapreduce-trunk - Build # 3167 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3167/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32033 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.052 sec - in org.apache.hadoop.mapred.TestBadRecords
Running org.apache.hadoop.mapred.TestMiniMRDFSCaching
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.052 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.864 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.579 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.722 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.754 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.531 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.124 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:28 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:35 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-07T02:20:32+00:00
[INFO] Final Memory: 35M/713M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3166 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3166/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31464 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.429 sec - in org.apache.hadoop.mapreduce.v2.app.launcher.TestContainerLauncher
Running org.apache.hadoop.mapreduce.v2.app.TestFail
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.508 sec - in org.apache.hadoop.mapreduce.v2.app.TestFail
Running org.apache.hadoop.mapreduce.v2.app.TestMRApp
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.772 sec - in org.apache.hadoop.mapreduce.v2.app.TestMRApp
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.608 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.362 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.088 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.861 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:56 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.082 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.122 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:05 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:39 min
[INFO] Finished at: 2016-04-06T23:50:37+00:00
[INFO] Final Memory: 33M/693M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)



Hadoop-Mapreduce-trunk - Build # 3165 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3165/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 37844 lines...]
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.697 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.029 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.664 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests in error: 
  TestUberAM>TestMRJobs.testConfVerificationWithClassloader:310->TestMRJobs.testConfVerification:414 » 
  TestUberAM>TestMRJobs.tearDown:177 » YarnRuntime java.lang.reflect.InvocationT...

Tests run: 534, Failures: 2, Errors: 2, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.158 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.038 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.213 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:46 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2016-04-06T22:38:30+00:00
[INFO] Final Memory: 34M/713M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testConfVerificationWithClassloader

Error Message:
test timed out after 300000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 300000 milliseconds
	at java.lang.Thread.sleep(Native Method)
	at org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1404)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1362)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testConfVerification(TestMRJobs.java:414)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testConfVerificationWithClassloader(TestMRJobs.java:310)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.org.apache.hadoop.mapreduce.v2.TestUberAM

Error Message:
java.lang.reflect.InvocationTargetException

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.reflect.InvocationTargetException
	at org.apache.hadoop.yarn.factories.impl.pb.RecordFactoryPBImpl.newRecordInstance(RecordFactoryPBImpl.java:73)
	at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.unRegisterNM(NodeStatusUpdaterImpl.java:267)
	at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.serviceStop(NodeStatusUpdaterImpl.java:248)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceStop(NodeManager.java:378)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$NodeManagerWrapper.serviceStop(MiniYARNCluster.java:590)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.tearDown(TestMRJobs.java:177)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.reflect.InvocationTargetException: null
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.hadoop.yarn.factories.impl.pb.RecordFactoryPBImpl.newRecordInstance(RecordFactoryPBImpl.java:70)
	at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.unRegisterNM(NodeStatusUpdaterImpl.java:267)
	at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.serviceStop(NodeStatusUpdaterImpl.java:248)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceStop(NodeManager.java:378)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$NodeManagerWrapper.serviceStop(MiniYARNCluster.java:590)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.tearDown(TestMRJobs.java:177)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/proto/YarnServerCommonServiceProtos$UnRegisterNodeManagerRequestProto$1
	at org.apache.hadoop.yarn.proto.YarnServerCommonServiceProtos$UnRegisterNodeManagerRequestProto.<clinit>(YarnServerCommonServiceProtos.java:4118)
	at org.apache.hadoop.yarn.server.api.protocolrecords.impl.pb.UnRegisterNodeManagerRequestPBImpl.<init>(UnRegisterNodeManagerRequestPBImpl.java:33)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.hadoop.yarn.factories.impl.pb.RecordFactoryPBImpl.newRecordInstance(RecordFactoryPBImpl.java:70)
	at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.unRegisterNM(NodeStatusUpdaterImpl.java:267)
	at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.serviceStop(NodeStatusUpdaterImpl.java:248)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceStop(NodeManager.java:378)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$NodeManagerWrapper.serviceStop(MiniYARNCluster.java:590)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.tearDown(TestMRJobs.java:177)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.proto.YarnServerCommonServiceProtos$UnRegisterNodeManagerRequestProto$1
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hadoop.yarn.proto.YarnServerCommonServiceProtos$UnRegisterNodeManagerRequestProto.<clinit>(YarnServerCommonServiceProtos.java:4118)
	at org.apache.hadoop.yarn.server.api.protocolrecords.impl.pb.UnRegisterNodeManagerRequestPBImpl.<init>(UnRegisterNodeManagerRequestPBImpl.java:33)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.hadoop.yarn.factories.impl.pb.RecordFactoryPBImpl.newRecordInstance(RecordFactoryPBImpl.java:70)
	at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.unRegisterNM(NodeStatusUpdaterImpl.java:267)
	at org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.serviceStop(NodeStatusUpdaterImpl.java:248)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.yarn.server.nodemanager.NodeManager.serviceStop(NodeManager.java:378)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$NodeManagerWrapper.serviceStop(MiniYARNCluster.java:590)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.tearDown(TestMRJobs.java:177)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3164 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3164/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31992 lines...]
Running org.apache.hadoop.mapred.pipes.TestPipesNonJavaInputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.583 sec - in org.apache.hadoop.mapred.pipes.TestPipesNonJavaInputFormat
Running org.apache.hadoop.mapred.pipes.TestPipeApplication
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.795 sec - in org.apache.hadoop.mapred.pipes.TestPipeApplication
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.65 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.508 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.762 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.852 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:53 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.548 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.193 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:29 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:41 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:02 h
[INFO] Finished at: 2016-04-06T19:32:37+00:00
[INFO] Final Memory: 33M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3163 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3163/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31178 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.295 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Running org.apache.hadoop.mapred.TestMapFileOutputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.389 sec - in org.apache.hadoop.mapred.TestMapFileOutputFormat
Running org.apache.hadoop.mapred.TestJobAclsManager
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.501 sec - in org.apache.hadoop.mapred.TestJobAclsManager
Running org.apache.hadoop.mapred.TestLineRecordReader
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.431 sec - in org.apache.hadoop.mapred.TestLineRecordReader
Running org.apache.hadoop.mapred.TestClock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.092 sec - in org.apache.hadoop.mapred.TestClock
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.099 sec - in org.apache.hadoop.mapred.TestJobQueueClient

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.053 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:54 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:58 min
[INFO] Finished at: 2016-04-06T08:59:18+00:00
[INFO] Final Memory: 35M/913M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk - Build # 3162 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3162/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31998 lines...]
Running org.apache.hadoop.mapred.TestReduceTask
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.546 sec - in org.apache.hadoop.mapred.TestReduceTask
Running org.apache.hadoop.mapred.TestBadRecords
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.044 sec - in org.apache.hadoop.mapred.TestBadRecords
Running org.apache.hadoop.mapred.TestMiniMRDFSCaching
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.043 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.319 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.606 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.37 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 529, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.063 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:54 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.775 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.233 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:29 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:45 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:53 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:11 h
[INFO] Finished at: 2016-04-06T03:54:54+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3161 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3161/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31988 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 192.274 sec - in org.apache.hadoop.mapred.TestMiniMRWithDFSWithDistinctUsers
Running org.apache.hadoop.mapred.pipes.TestPipes
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.043 sec - in org.apache.hadoop.mapred.pipes.TestPipes
Running org.apache.hadoop.mapred.pipes.TestPipesNonJavaInputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.507 sec - in org.apache.hadoop.mapred.pipes.TestPipesNonJavaInputFormat
Running org.apache.hadoop.mapred.pipes.TestPipeApplication
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.855 sec - in org.apache.hadoop.mapred.pipes.TestPipeApplication
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.096 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.692 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.153 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 532, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.790 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.878 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.099 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:19 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:58 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:15 h
[INFO] Finished at: 2016-04-06T01:10:02+00:00
[INFO] Final Memory: 37M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3160 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3160/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 32002 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.047 sec - in org.apache.hadoop.mapred.TestBadRecords
Running org.apache.hadoop.mapred.TestMiniMRDFSCaching
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.051 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.717 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.836 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.659 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.639 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:48 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.290 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.990 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:20 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:45 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-05T22:27:19+00:00
[INFO] Final Memory: 34M/748M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3159 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3159/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31992 lines...]
Running org.apache.hadoop.mapreduce.lib.output.TestMRCJCFileOutputCommitter
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.657 sec - in org.apache.hadoop.mapreduce.lib.output.TestMRCJCFileOutputCommitter
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.29 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.67 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.256 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.458 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.941 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.027 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.221 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:28 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:40 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:00 h
[INFO] Finished at: 2016-04-05T19:28:45+00:00
[INFO] Final Memory: 33M/611M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3158 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3158/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31996 lines...]
Running org.apache.hadoop.mapreduce.lib.chain.TestMapReduceChain
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.042 sec - in org.apache.hadoop.mapreduce.lib.chain.TestMapReduceChain
Running org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.027 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.832 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.012 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.591 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.522 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.697 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.064 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:31 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:48 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:02 h
[INFO] Finished at: 2016-04-05T17:00:23+00:00
[INFO] Final Memory: 34M/730M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3157 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3157/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31992 lines...]
Running org.apache.hadoop.mapred.TestReporter
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.799 sec - in org.apache.hadoop.mapred.TestReporter
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.968 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.059 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.075 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.11 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.836 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.194 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.129 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:18 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:39 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:00 h
[INFO] Finished at: 2016-04-05T11:58:09+00:00
[INFO] Final Memory: 34M/611M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3156 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3156/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31993 lines...]
Running org.apache.hadoop.mapred.TestReporter
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.657 sec - in org.apache.hadoop.mapred.TestReporter
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.883 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.806 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.654 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.066 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.016 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:04 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.154 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.498 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:54 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:49 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:45 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-05T06:39:50+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3155 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3155/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31992 lines...]
Running org.apache.hadoop.mapred.TestReporter
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.88 sec - in org.apache.hadoop.mapred.TestReporter
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.942 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 74.265 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.902 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.335 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.202 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:04 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.443 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.548 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:55 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:43 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 h
[INFO] Finished at: 2016-04-05T04:05:39+00:00
[INFO] Final Memory: 34M/766M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3154 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3154/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31178 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.773 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Running org.apache.hadoop.mapred.TestMapFileOutputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.416 sec - in org.apache.hadoop.mapred.TestMapFileOutputFormat
Running org.apache.hadoop.mapred.TestJobAclsManager
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.53 sec - in org.apache.hadoop.mapred.TestJobAclsManager
Running org.apache.hadoop.mapred.TestLineRecordReader
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.465 sec - in org.apache.hadoop.mapred.TestLineRecordReader
Running org.apache.hadoop.mapred.TestClock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.091 sec - in org.apache.hadoop.mapred.TestClock
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.096 sec - in org.apache.hadoop.mapred.TestJobQueueClient

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.091 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:57 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:02 min
[INFO] Finished at: 2016-04-05T01:29:13+00:00
[INFO] Final Memory: 30M/718M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
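
Note on the error message above: "Error Message: null" is what surefire reports when an assertTrue(boolean) call made without a message string fails, because the resulting AssertionError carries no message and only the stack trace (TestCLI.java:181 here) identifies the failing check. A small, hypothetical illustration of the pattern, not the actual TestCLI code:

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class NullMessageAssertionSketch {

      @Test
      public void testConditionWithoutMessage() {
        boolean jobLookupSucceeded = false; // hypothetical condition the real test would compute
        // assertTrue with no message argument fails with an AssertionError whose
        // getMessage() is null, which the report then renders as "Error Message: null".
        assertTrue(jobLookupSucceeded);
      }
    }

Passing an explanatory string as the first argument to assertTrue would make these reports self-describing.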



Hadoop-Mapreduce-trunk - Build # 3153 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3153/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31992 lines...]
Running org.apache.hadoop.mapred.TestReporter
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.734 sec - in org.apache.hadoop.mapred.TestReporter
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.893 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 73.613 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.779 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.086 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.956 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:51 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.941 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.113 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:21 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:35 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-05T00:58:54+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3152 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3152/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31992 lines...]
Running org.apache.hadoop.mapred.TestReporter
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.733 sec - in org.apache.hadoop.mapred.TestReporter
Running org.apache.hadoop.mapred.TestClientRedirect
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.87 sec - in org.apache.hadoop.mapred.TestClientRedirect
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.756 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.696 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.094 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.870 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.996 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.207 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:34 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:38 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-04T21:40:03+00:00
[INFO] Final Memory: 34M/600M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk - Build # 3151 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/3151/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 31992 lines...]
Running org.apache.hadoop.mapred.pipes.TestPipesNonJavaInputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.562 sec - in org.apache.hadoop.mapred.pipes.TestPipesNonJavaInputFormat
Running org.apache.hadoop.mapred.pipes.TestPipeApplication
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.769 sec - in org.apache.hadoop.mapred.pipes.TestPipeApplication
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.545 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.669 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.414 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.836 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.351 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.191 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:23 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:38 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-04-04T19:11:26+00:00
[INFO] Final Memory: 34M/611M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
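
One reporting convention worth noting for every "Failed tests:" summary in these digests: an entry of the form TestUberAM>TestMRJobs.testJobWithChangePriority:276 is surefire's way of saying the failing method is inherited (here, TestUberAM extends TestMRJobs), so the same test body runs, and fails, once per concrete class. A small, hypothetical two-file sketch of that behavior, with illustrative class names rather than the Hadoop ones:

    // File: ParentJobsTest.java (stand-in for a base test class such as TestMRJobs)
    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class ParentJobsTest {
      @Test
      public void testJobWithChangePriority() {
        // Fails whenever this method runs, whether under ParentJobsTest
        // or under a subclass that inherits it.
        assertEquals("DEFAULT", "HIGH");
      }
    }

    // File: UberVariantTest.java (stand-in for the subclass, e.g. TestUberAM)
    public class UberVariantTest extends ParentJobsTest {
      // No code needed: JUnit runs the inherited @Test method again under this class,
      // and surefire lists the failure as
      // UberVariantTest>ParentJobsTest.testJobWithChangePriority, mirroring the
      // TestUberAM>TestMRJobs entries in the summaries above.
    }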