Posted to mapreduce-dev@hadoop.apache.org by Apache Hudson Server <hu...@hudson.apache.org> on 2011/02/14 17:02:07 UTC

Hadoop-Mapreduce-trunk - Build # 595 - Still Failing

See https://hudson.apache.org/hudson/job/Hadoop-Mapreduce-trunk/595/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 211644 lines...]
    [junit]    0.85:96549
    [junit]    0.9:96658
    [junit]    0.95:96670
    [junit] Failed Reduce CDF --------
    [junit] 0: -9223372036854775808--9223372036854775807
    [junit] map attempts to success -- 0.6567164179104478, 0.3283582089552239, 0.014925373134328358, 
    [junit] ===============
    [junit] 2011-02-14 16:00:14,239 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000025 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,239 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000028 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,240 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000029 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,240 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000030 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,240 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000031 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,241 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000032 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,241 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000033 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,241 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000034 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,242 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000035 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,242 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000036 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,242 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000037 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,243 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000038 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,243 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000039 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,243 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000040 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,244 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000041 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,244 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000042 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,244 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000043 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,245 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000044 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,245 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000045 has nulll TaskStatus
    [junit] 2011-02-14 16:00:14,245 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000046 has nulll TaskStatus
    [junit] generated failed map runtime distribution
    [junit] 100000: 18592--18592
    [junit]    0.1:18592
    [junit]    0.5:18592
    [junit]    0.9:18592
    [junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 2.976 sec
    [junit] Running org.apache.hadoop.util.TestReflectionUtils
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.362 sec
    [junit] Running org.apache.hadoop.util.TestRunJar
    [junit] Creating file/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/build/test/data/out
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.321 sec

checkfailure:
    [touch] Creating /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/build/test/testsfailed

run-test-mapred-all-withtestcaseonly:

run-test-mapred:

BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/build.xml:817: Tests failed!

Total time: 173 minutes 7 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording fingerprints
Archiving artifacts
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
REGRESSION:  org.apache.hadoop.mapreduce.TestLocalRunner.testMultiMaps

Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.

Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.
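
The "Timeout occurred" text above is the generic failure that Ant's junit task substitutes when a forked test JVM exceeds its configured timeout, which is why the reported elapsed time says nothing about how long TestLocalRunner.testMultiMaps actually ran, and why the stack trace carries no frames from the test itself. The real test body is not visible in this log; purely as a hedged sketch (the class name, mapper, paths, and job layout below are all assumptions, not the actual test), this is the kind of multi-map job that must finish under the in-process LocalJobRunner before the harness timeout fires:

    // Hedged sketch only: not the real TestLocalRunner.testMultiMaps. It runs a
    // small job with several input paths, hence several map tasks, through the
    // in-process LocalJobRunner, the same code path the failing test exercises.
    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class MultiMapLocalRunnerSketch {

      // Identity-style mapper; each input file contributes at least one map task.
      static class PassThroughMapper
          extends Mapper<LongWritable, Text, LongWritable, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
          context.write(key, value);
        }
      }

      public static void main(String[] args) throws Exception {
        // An out-of-the-box Configuration runs jobs in-process through the
        // LocalJobRunner against the local filesystem, so no cluster is needed.
        Configuration conf = new Configuration();

        Job job = new Job(conf, "multi-map-sketch");
        job.setJarByClass(MultiMapLocalRunnerSketch.class);
        job.setMapperClass(PassThroughMapper.class);
        job.setNumReduceTasks(1);
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);

        // Several input paths mean several map tasks under the local runner.
        // These paths are placeholders; point them at any small text files.
        FileInputFormat.addInputPath(job, new Path("in/part-0"));
        FileInputFormat.addInputPath(job, new Path("in/part-1"));
        FileInputFormat.addInputPath(job, new Path("in/part-2"));
        FileOutputFormat.setOutputPath(job, new Path("out"));

        // In the failing build it is a call like this that must return before
        // Ant's per-JVM timeout fires and produces the failure quoted above.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }

Pointing the three input paths at small local text files and running the class with the Hadoop jars on the classpath yields at least one map task per file, all executed inside a single JVM, which is the scenario that has to complete within the build's timeout.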




Hadoop-Mapreduce-trunk - Build # 599 - Still Failing

Posted by Apache Hudson Server <hu...@hudson.apache.org>.
See https://hudson.apache.org/hudson/job/Hadoop-Mapreduce-trunk/599/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 4044 lines...]
[ivy:resolve] 	commons-codec#commons-codec;1.3 by [commons-codec#commons-codec;1.4] in [common]
[ivy:resolve] 	commons-codec#commons-codec;${commons-codec.version} by [commons-codec#commons-codec;1.4] in [common]
[ivy:resolve] 	org.codehaus.jackson#jackson-mapper-asl;${jackson.version} by [org.codehaus.jackson#jackson-mapper-asl;1.4.2] in [common]
[ivy:resolve] 	org.codehaus.jackson#jackson-core-asl;${jackson.version} by [org.codehaus.jackson#jackson-core-asl;1.4.2] in [common]
[ivy:resolve] 	com.thoughtworks.paranamer#paranamer;${paranamer.version} by [com.thoughtworks.paranamer#paranamer;2.2] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   42  |   2   |   0   |   8   ||   34  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#raid [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	34 artifacts copied, 0 already retrieved (13238kB/49ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/ivy/ivysettings.xml

compile:
     [echo] contrib: raid
    [javac] Compiling 32 source files to /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/build/contrib/raid/classes
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/src/contrib/raid/src/java/org/apache/hadoop/hdfs/server/namenode/BlockPlacementPolicyRaid.java:50: org.apache.hadoop.hdfs.server.namenode.BlockPlacementPolicyRaid is not abstract and does not override abstract method chooseTarget(java.lang.String,int,org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor,java.util.List<org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor>,boolean,java.util.HashMap<org.apache.hadoop.net.Node,org.apache.hadoop.net.Node>,long) in org.apache.hadoop.hdfs.server.namenode.BlockPlacementPolicy
    [javac] public class BlockPlacementPolicyRaid extends BlockPlacementPolicy {
    [javac]        ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/src/contrib/raid/src/java/org/apache/hadoop/hdfs/server/namenode/BlockPlacementPolicyRaid.java:109: chooseTarget(java.lang.String,int,org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor,java.util.List<org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor>,java.util.HashMap<org.apache.hadoop.net.Node,org.apache.hadoop.net.Node>,long) in org.apache.hadoop.hdfs.server.namenode.BlockPlacementPolicyRaid cannot override chooseTarget(java.lang.String,int,org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor,java.util.List<org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor>,java.util.HashMap<org.apache.hadoop.net.Node,org.apache.hadoop.net.Node>,long) in org.apache.hadoop.hdfs.server.namenode.BlockPlacementPolicy; overridden method is final
    [javac]   DatanodeDescriptor[] chooseTarget(String srcPath, int numOfReplicas,
    [javac]                        ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/src/contrib/raid/src/java/org/apache/hadoop/hdfs/server/namenode/BlockPlacementPolicyRaid.java:118: cannot find symbol
    [javac] symbol  : method chooseTarget(int,org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor,java.util.List<org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor>,java.util.HashMap<org.apache.hadoop.net.Node,org.apache.hadoop.net.Node>,long)
    [javac] location: class org.apache.hadoop.hdfs.server.namenode.BlockPlacementPolicyDefault
    [javac]         defaultPolicy.chooseTarget(numOfReplicas, writer,
    [javac]                      ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 3 errors

BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/build.xml:432: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/src/contrib/build.xml:30: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/src/contrib/build-contrib.xml:193: Compile failed; see the compiler error output for details.

Total time: 2 minutes 12 seconds
+ RESULT=1
+ [ 1 != 0 ]
+ echo Build Failed: remaining tests not run
Build Failed: remaining tests not run
+ exit 1
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording fingerprints
Archiving artifacts
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.
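
Build 599 (and build 598 below, which fails the same way) never reaches the tests: the contrib/raid module stops compiling because the HDFS BlockPlacementPolicy class it extends changed underneath it. The three javac errors point at one API break. The abstract chooseTarget overload now takes an extra boolean parameter, so BlockPlacementPolicyRaid no longer implements it; the older overload has become final, so the raid class may no longer override it; and the delegation call into BlockPlacementPolicyDefault uses a signature that no longer exists. The miniature below is a hedged illustration only (every name in it is made up to stand in for the real HDFS classes, and it is not the patch that later fixed the build), but it compiles and shows the shape of subclass the new base class demands: implement the new overload and let the final legacy overload delegate to it.

    // Hedged, self-contained miniature of the API break; PlacementPolicy and
    // RaidPlacementPolicy are stand-ins, not the real HDFS classes.
    import java.util.HashMap;
    import java.util.List;

    abstract class PlacementPolicy {
      // New abstract signature: subclasses must implement this overload.
      abstract String[] chooseTarget(String srcPath, int numOfReplicas,
          String writer, List<String> chosen, boolean returnChosen,
          HashMap<String, String> excluded, long blocksize);

      // The legacy overload is now final: callers may still use it, but a
      // subclass that tries to override it reproduces the
      // "overridden method is final" error quoted above.
      final String[] chooseTarget(String srcPath, int numOfReplicas,
          String writer, List<String> chosen,
          HashMap<String, String> excluded, long blocksize) {
        return chooseTarget(srcPath, numOfReplicas, writer, chosen,
            false /* returnChosen */, excluded, blocksize);
      }
    }

    // The shape the compiler is asking for: override the new overload.
    class RaidPlacementPolicy extends PlacementPolicy {
      @Override
      String[] chooseTarget(String srcPath, int numOfReplicas, String writer,
          List<String> chosen, boolean returnChosen,
          HashMap<String, String> excluded, long blocksize) {
        // Trivial stand-in for delegating to a default placement policy.
        return new String[] { writer };
      }
    }

A subclass that only defines the six-argument signature hits two of the errors at once: it fails to implement the abstract seven-argument method and illegally overrides the final one, which matches the pair of messages javac prints for BlockPlacementPolicyRaid.java lines 50 and 109.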

Hadoop-Mapreduce-trunk - Build # 598 - Still Failing

Posted by Apache Hudson Server <hu...@hudson.apache.org>.
See https://hudson.apache.org/hudson/job/Hadoop-Mapreduce-trunk/598/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 4076 lines...]
[ivy:resolve] 	commons-codec#commons-codec;1.3 by [commons-codec#commons-codec;1.4] in [common]
[ivy:resolve] 	commons-codec#commons-codec;${commons-codec.version} by [commons-codec#commons-codec;1.4] in [common]
[ivy:resolve] 	org.codehaus.jackson#jackson-mapper-asl;${jackson.version} by [org.codehaus.jackson#jackson-mapper-asl;1.4.2] in [common]
[ivy:resolve] 	org.codehaus.jackson#jackson-core-asl;${jackson.version} by [org.codehaus.jackson#jackson-core-asl;1.4.2] in [common]
[ivy:resolve] 	com.thoughtworks.paranamer#paranamer;${paranamer.version} by [com.thoughtworks.paranamer#paranamer;2.2] in [common]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      common      |   42  |   2   |   0   |   8   ||   34  |   0   |
	---------------------------------------------------------------------

ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#raid [sync]
[ivy:retrieve] 	confs: [common]
[ivy:retrieve] 	34 artifacts copied, 0 already retrieved (13238kB/53ms)
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/ivy/ivysettings.xml

compile:
     [echo] contrib: raid
    [javac] Compiling 32 source files to /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/build/contrib/raid/classes
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/src/contrib/raid/src/java/org/apache/hadoop/hdfs/server/namenode/BlockPlacementPolicyRaid.java:50: org.apache.hadoop.hdfs.server.namenode.BlockPlacementPolicyRaid is not abstract and does not override abstract method chooseTarget(java.lang.String,int,org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor,java.util.List<org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor>,boolean,java.util.HashMap<org.apache.hadoop.net.Node,org.apache.hadoop.net.Node>,long) in org.apache.hadoop.hdfs.server.namenode.BlockPlacementPolicy
    [javac] public class BlockPlacementPolicyRaid extends BlockPlacementPolicy {
    [javac]        ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/src/contrib/raid/src/java/org/apache/hadoop/hdfs/server/namenode/BlockPlacementPolicyRaid.java:109: chooseTarget(java.lang.String,int,org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor,java.util.List<org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor>,java.util.HashMap<org.apache.hadoop.net.Node,org.apache.hadoop.net.Node>,long) in org.apache.hadoop.hdfs.server.namenode.BlockPlacementPolicyRaid cannot override chooseTarget(java.lang.String,int,org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor,java.util.List<org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor>,java.util.HashMap<org.apache.hadoop.net.Node,org.apache.hadoop.net.Node>,long) in org.apache.hadoop.hdfs.server.namenode.BlockPlacementPolicy; overridden method is final
    [javac]   DatanodeDescriptor[] chooseTarget(String srcPath, int numOfReplicas,
    [javac]                        ^
    [javac] /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/src/contrib/raid/src/java/org/apache/hadoop/hdfs/server/namenode/BlockPlacementPolicyRaid.java:118: cannot find symbol
    [javac] symbol  : method chooseTarget(int,org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor,java.util.List<org.apache.hadoop.hdfs.server.namenode.DatanodeDescriptor>,java.util.HashMap<org.apache.hadoop.net.Node,org.apache.hadoop.net.Node>,long)
    [javac] location: class org.apache.hadoop.hdfs.server.namenode.BlockPlacementPolicyDefault
    [javac]         defaultPolicy.chooseTarget(numOfReplicas, writer,
    [javac]                      ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 3 errors

BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/build.xml:432: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/src/contrib/build.xml:30: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/src/contrib/build-contrib.xml:193: Compile failed; see the compiler error output for details.

Total time: 2 minutes 26 seconds
+ RESULT=1
+ [ 1 != 0 ]
+ echo Build Failed: remaining tests not run
Build Failed: remaining tests not run
+ exit 1
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording fingerprints
Archiving artifacts
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Mapreduce-trunk - Build # 597 - Still Failing

Posted by Apache Hudson Server <hu...@hudson.apache.org>.
See https://hudson.apache.org/hudson/job/Hadoop-Mapreduce-trunk/597/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 212900 lines...]
    [junit]    0.85:96549
    [junit]    0.9:96658
    [junit]    0.95:96670
    [junit] Failed Reduce CDF --------
    [junit] 0: -9223372036854775808--9223372036854775807
    [junit] map attempts to success -- 0.6567164179104478, 0.3283582089552239, 0.014925373134328358, 
    [junit] ===============
    [junit] 2011-02-16 15:57:06,739 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000025 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,739 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000028 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,740 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000029 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,740 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000030 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,740 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000031 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,741 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000032 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,741 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000033 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,741 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000034 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,742 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000035 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,742 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000036 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,742 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000037 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,743 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000038 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,743 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000039 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,743 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000040 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,744 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000041 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,744 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000042 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,744 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000043 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,745 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000044 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,745 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000045 has nulll TaskStatus
    [junit] 2011-02-16 15:57:06,746 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000046 has nulll TaskStatus
    [junit] generated failed map runtime distribution
    [junit] 100000: 18592--18592
    [junit]    0.1:18592
    [junit]    0.5:18592
    [junit]    0.9:18592
    [junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 3.008 sec
    [junit] Running org.apache.hadoop.util.TestReflectionUtils
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.34 sec
    [junit] Running org.apache.hadoop.util.TestRunJar
    [junit] Creating file/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/build/test/data/out
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.294 sec

checkfailure:
    [touch] Creating /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/build/test/testsfailed

run-test-mapred-all-withtestcaseonly:

run-test-mapred:

BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/build.xml:817: Tests failed!

Total time: 169 minutes 33 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording fingerprints
Archiving artifacts
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.TestLocalRunner.testMultiMaps

Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.

Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.




Hadoop-Mapreduce-trunk - Build # 596 - Still Failing

Posted by Apache Hudson Server <hu...@hudson.apache.org>.
See https://hudson.apache.org/hudson/job/Hadoop-Mapreduce-trunk/596/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 212799 lines...]
    [junit]    0.85:96549
    [junit]    0.9:96658
    [junit]    0.95:96670
    [junit] Failed Reduce CDF --------
    [junit] 0: -9223372036854775808--9223372036854775807
    [junit] map attempts to success -- 0.6567164179104478, 0.3283582089552239, 0.014925373134328358, 
    [junit] ===============
    [junit] 2011-02-15 16:16:22,448 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000025 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,449 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000028 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,449 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000029 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,450 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000030 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,450 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000031 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,450 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000032 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,451 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000033 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,451 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000034 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,451 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000035 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,452 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000036 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,452 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000037 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,452 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000038 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,453 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000039 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,453 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000040 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,453 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000041 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,454 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000042 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,454 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000043 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,454 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000044 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,455 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000045 has nulll TaskStatus
    [junit] 2011-02-15 16:16:22,455 WARN  rumen.ZombieJob (ZombieJob.java:sanitizeLoggedTask(318)) - Task task_200904211745_0004_r_000046 has nulll TaskStatus
    [junit] generated failed map runtime distribution
    [junit] 100000: 18592--18592
    [junit]    0.1:18592
    [junit]    0.5:18592
    [junit]    0.9:18592
    [junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 3.003 sec
    [junit] Running org.apache.hadoop.util.TestReflectionUtils
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.369 sec
    [junit] Running org.apache.hadoop.util.TestRunJar
    [junit] Creating file/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/build/test/data/out
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.293 sec

checkfailure:
    [touch] Creating /grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/build/test/testsfailed

run-test-mapred-all-withtestcaseonly:

run-test-mapred:

BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Mapreduce-trunk/trunk/build.xml:817: Tests failed!

Total time: 188 minutes 41 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Recording fingerprints
Archiving artifacts
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.TestLocalRunner.testMultiMaps

Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.

Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.