Posted to dev@mahout.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2012/02/08 19:21:06 UTC

Build failed in Jenkins: Mahout-Examples-Cluster-Reuters-II #36

See <https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters-II/36/changes>

Changes:

[srowen] MAHOUT-946 (missed one file)

[srowen] MAHOUT-946 Don't ignore failed MR status

[srowen] MAHOUT-951 unroll recursion to avoid stack overflow

[srowen] MAHOUT-971 Use FileSystem.get(URI, Configuration) across the board to make it (more likely to) work with S3

[srowen] MAHOUT-963 additional change to "less than" for reversed sort, which is necessary in one test

[srowen] MAHOUT-965 Add ability to specify mapping collection

[srowen] MAHOUT-967 add SequenceFileFromMailArchive config

[srowen] MAHOUT-970 Centralize version numbers

[srowen] MAHOUT-963 faster sort of items/users

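For context on the MAHOUT-971 change in the list above: FileSystem.get(Configuration) always returns the configured default filesystem, so a path on another scheme such as s3n:// resolves against the wrong store, while FileSystem.get(URI, Configuration) picks the filesystem that matches the path itself. A minimal sketch of the pattern follows; the bucket and variable names are illustrative only, not the actual Mahout code.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class FsLookupSketch {
      public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Illustrative path; could equally be an hdfs:// or local file: URI.
        Path output = new Path("s3n://example-bucket/mahout-work/clusters-0");

        // Old pattern: always resolves to the default filesystem,
        // regardless of the scheme in 'output'.
        FileSystem defaultFs = FileSystem.get(conf);

        // MAHOUT-971 pattern: resolve the filesystem from the path's own URI,
        // so an s3n:// path gets the S3 filesystem implementation.
        FileSystem pathFs = FileSystem.get(output.toUri(), conf);

        System.out.println("default fs: " + defaultFs.getUri());
        System.out.println("path fs:    " + pathFs.getUri());
      }
    }
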
------------------------------------------
[...truncated 6643 lines...]
12/02/08 18:20:31 INFO mapred.Task: Task:attempt_local_0009_m_000000_0 is done. And is in the process of commiting
12/02/08 18:20:32 INFO mapred.JobClient:  map 0% reduce 0%
12/02/08 18:20:34 INFO mapred.LocalJobRunner: 
12/02/08 18:20:34 INFO mapred.Task: Task 'attempt_local_0009_m_000000_0' done.
12/02/08 18:20:34 INFO mapred.LocalJobRunner: 
12/02/08 18:20:34 INFO mapred.Merger: Merging 1 sorted segments
12/02/08 18:20:34 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 132 bytes
12/02/08 18:20:34 INFO mapred.LocalJobRunner: 
12/02/08 18:20:34 INFO mapred.Task: Task:attempt_local_0009_r_000000_0 is done. And is in the process of commiting
12/02/08 18:20:34 INFO mapred.LocalJobRunner: 
12/02/08 18:20:34 INFO mapred.Task: Task attempt_local_0009_r_000000_0 is allowed to commit now
12/02/08 18:20:34 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0009_r_000000_0' to /tmp/mahout-work-hudson/reuters-dirichlet/clusters-9
12/02/08 18:20:34 INFO mapred.JobClient:  map 100% reduce 0%
12/02/08 18:20:37 INFO mapred.LocalJobRunner: reduce > reduce
12/02/08 18:20:37 INFO mapred.Task: Task 'attempt_local_0009_r_000000_0' done.
12/02/08 18:20:37 INFO mapred.JobClient:  map 100% reduce 100%
12/02/08 18:20:37 INFO mapred.JobClient: Job complete: job_local_0009
12/02/08 18:20:37 INFO mapred.JobClient: Counters: 16
12/02/08 18:20:37 INFO mapred.JobClient:   File Output Format Counters 
12/02/08 18:20:37 INFO mapred.JobClient:     Bytes Written=3090
12/02/08 18:20:37 INFO mapred.JobClient:   FileSystemCounters
12/02/08 18:20:37 INFO mapred.JobClient:     FILE_BYTES_READ=424872566
12/02/08 18:20:37 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=428747472
12/02/08 18:20:37 INFO mapred.JobClient:   File Input Format Counters 
12/02/08 18:20:37 INFO mapred.JobClient:     Bytes Read=102
12/02/08 18:20:37 INFO mapred.JobClient:   Map-Reduce Framework
12/02/08 18:20:37 INFO mapred.JobClient:     Reduce input groups=20
12/02/08 18:20:37 INFO mapred.JobClient:     Map output materialized bytes=136
12/02/08 18:20:37 INFO mapred.JobClient:     Combine output records=0
12/02/08 18:20:37 INFO mapred.JobClient:     Map input records=0
12/02/08 18:20:37 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/02/08 18:20:37 INFO mapred.JobClient:     Reduce output records=20
12/02/08 18:20:37 INFO mapred.JobClient:     Spilled Records=40
12/02/08 18:20:37 INFO mapred.JobClient:     Map output bytes=90
12/02/08 18:20:37 INFO mapred.JobClient:     Combine input records=0
12/02/08 18:20:37 INFO mapred.JobClient:     Map output records=20
12/02/08 18:20:37 INFO mapred.JobClient:     SPLIT_RAW_BYTES=156
12/02/08 18:20:37 INFO mapred.JobClient:     Reduce input records=20
12/02/08 18:20:37 INFO dirichlet.DirichletDriver: Iteration 10
12/02/08 18:20:37 INFO input.FileInputFormat: Total input paths to process : 1
12/02/08 18:20:37 INFO mapred.JobClient: Running job: job_local_0010
12/02/08 18:20:37 INFO mapred.MapTask: io.sort.mb = 100
12/02/08 18:20:37 INFO mapred.MapTask: data buffer = 79691776/99614720
12/02/08 18:20:37 INFO mapred.MapTask: record buffer = 262144/327680
12/02/08 18:20:38 INFO mapred.MapTask: Starting flush of map output
12/02/08 18:20:38 INFO mapred.MapTask: Finished spill 0
12/02/08 18:20:38 INFO mapred.Task: Task:attempt_local_0010_m_000000_0 is done. And is in the process of commiting
12/02/08 18:20:38 INFO mapred.JobClient:  map 0% reduce 0%
12/02/08 18:20:40 INFO mapred.LocalJobRunner: 
12/02/08 18:20:40 INFO mapred.Task: Task 'attempt_local_0010_m_000000_0' done.
12/02/08 18:20:40 INFO mapred.LocalJobRunner: 
12/02/08 18:20:40 INFO mapred.Merger: Merging 1 sorted segments
12/02/08 18:20:40 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 132 bytes
12/02/08 18:20:40 INFO mapred.LocalJobRunner: 
12/02/08 18:20:40 INFO mapred.Task: Task:attempt_local_0010_r_000000_0 is done. And is in the process of commiting
12/02/08 18:20:40 INFO mapred.LocalJobRunner: 
12/02/08 18:20:40 INFO mapred.Task: Task attempt_local_0010_r_000000_0 is allowed to commit now
12/02/08 18:20:40 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0010_r_000000_0' to /tmp/mahout-work-hudson/reuters-dirichlet/clusters-10
12/02/08 18:20:40 INFO mapred.JobClient:  map 100% reduce 0%
12/02/08 18:20:43 INFO mapred.LocalJobRunner: reduce > reduce
12/02/08 18:20:43 INFO mapred.Task: Task 'attempt_local_0010_r_000000_0' done.
12/02/08 18:20:43 INFO mapred.JobClient:  map 100% reduce 100%
12/02/08 18:20:43 INFO mapred.JobClient: Job complete: job_local_0010
12/02/08 18:20:43 INFO mapred.JobClient: Counters: 16
12/02/08 18:20:43 INFO mapred.JobClient:   File Output Format Counters 
12/02/08 18:20:43 INFO mapred.JobClient:     Bytes Written=3090
12/02/08 18:20:43 INFO mapred.JobClient:   FileSystemCounters
12/02/08 18:20:43 INFO mapred.JobClient:     FILE_BYTES_READ=472079978
12/02/08 18:20:43 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=476385250
12/02/08 18:20:43 INFO mapred.JobClient:   File Input Format Counters 
12/02/08 18:20:43 INFO mapred.JobClient:     Bytes Read=102
12/02/08 18:20:43 INFO mapred.JobClient:   Map-Reduce Framework
12/02/08 18:20:43 INFO mapred.JobClient:     Reduce input groups=20
12/02/08 18:20:43 INFO mapred.JobClient:     Map output materialized bytes=136
12/02/08 18:20:43 INFO mapred.JobClient:     Combine output records=0
12/02/08 18:20:43 INFO mapred.JobClient:     Map input records=0
12/02/08 18:20:43 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/02/08 18:20:43 INFO mapred.JobClient:     Reduce output records=20
12/02/08 18:20:43 INFO mapred.JobClient:     Spilled Records=40
12/02/08 18:20:43 INFO mapred.JobClient:     Map output bytes=90
12/02/08 18:20:43 INFO mapred.JobClient:     Combine input records=0
12/02/08 18:20:43 INFO mapred.JobClient:     Map output records=20
12/02/08 18:20:43 INFO mapred.JobClient:     SPLIT_RAW_BYTES=156
12/02/08 18:20:43 INFO mapred.JobClient:     Reduce input records=20
12/02/08 18:20:43 INFO dirichlet.DirichletDriver: Iteration 11
12/02/08 18:20:43 INFO input.FileInputFormat: Total input paths to process : 1
12/02/08 18:20:43 INFO mapred.JobClient: Running job: job_local_0011
12/02/08 18:20:43 INFO mapred.MapTask: io.sort.mb = 100
12/02/08 18:20:44 INFO mapred.MapTask: data buffer = 79691776/99614720
12/02/08 18:20:44 INFO mapred.MapTask: record buffer = 262144/327680
12/02/08 18:20:44 INFO mapred.MapTask: Starting flush of map output
12/02/08 18:20:44 INFO mapred.MapTask: Finished spill 0
12/02/08 18:20:44 INFO mapred.Task: Task:attempt_local_0011_m_000000_0 is done. And is in the process of commiting
12/02/08 18:20:44 INFO mapred.JobClient:  map 0% reduce 0%
12/02/08 18:20:46 INFO mapred.LocalJobRunner: 
12/02/08 18:20:46 INFO mapred.Task: Task 'attempt_local_0011_m_000000_0' done.
12/02/08 18:20:46 INFO mapred.LocalJobRunner: 
12/02/08 18:20:46 INFO mapred.Merger: Merging 1 sorted segments
12/02/08 18:20:46 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 132 bytes
12/02/08 18:20:46 INFO mapred.LocalJobRunner: 
12/02/08 18:20:46 INFO mapred.Task: Task:attempt_local_0011_r_000000_0 is done. And is in the process of commiting
12/02/08 18:20:46 INFO mapred.LocalJobRunner: 
12/02/08 18:20:46 INFO mapred.Task: Task attempt_local_0011_r_000000_0 is allowed to commit now
12/02/08 18:20:46 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0011_r_000000_0' to /tmp/mahout-work-hudson/reuters-dirichlet/clusters-11
12/02/08 18:20:46 INFO mapred.JobClient:  map 100% reduce 0%
12/02/08 18:20:49 INFO mapred.LocalJobRunner: reduce > reduce
12/02/08 18:20:49 INFO mapred.Task: Task 'attempt_local_0011_r_000000_0' done.
12/02/08 18:20:49 INFO mapred.JobClient:  map 100% reduce 100%
12/02/08 18:20:49 INFO mapred.JobClient: Job complete: job_local_0011
12/02/08 18:20:49 INFO mapred.JobClient: Counters: 16
12/02/08 18:20:49 INFO mapred.JobClient:   File Output Format Counters 
12/02/08 18:20:49 INFO mapred.JobClient:     Bytes Written=3090
12/02/08 18:20:49 INFO mapred.JobClient:   FileSystemCounters
12/02/08 18:20:49 INFO mapred.JobClient:     FILE_BYTES_READ=519287390
12/02/08 18:20:49 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=524023044
12/02/08 18:20:49 INFO mapred.JobClient:   File Input Format Counters 
12/02/08 18:20:49 INFO mapred.JobClient:     Bytes Read=102
12/02/08 18:20:49 INFO mapred.JobClient:   Map-Reduce Framework
12/02/08 18:20:49 INFO mapred.JobClient:     Reduce input groups=20
12/02/08 18:20:49 INFO mapred.JobClient:     Map output materialized bytes=136
12/02/08 18:20:49 INFO mapred.JobClient:     Combine output records=0
12/02/08 18:20:49 INFO mapred.JobClient:     Map input records=0
12/02/08 18:20:49 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/02/08 18:20:49 INFO mapred.JobClient:     Reduce output records=20
12/02/08 18:20:49 INFO mapred.JobClient:     Spilled Records=40
12/02/08 18:20:49 INFO mapred.JobClient:     Map output bytes=90
12/02/08 18:20:49 INFO mapred.JobClient:     Combine input records=0
12/02/08 18:20:49 INFO mapred.JobClient:     Map output records=20
12/02/08 18:20:49 INFO mapred.JobClient:     SPLIT_RAW_BYTES=156
12/02/08 18:20:49 INFO mapred.JobClient:     Reduce input records=20
12/02/08 18:20:49 INFO dirichlet.DirichletDriver: Iteration 12
12/02/08 18:20:50 INFO input.FileInputFormat: Total input paths to process : 1
12/02/08 18:20:50 INFO mapred.JobClient: Running job: job_local_0012
12/02/08 18:20:50 INFO mapred.MapTask: io.sort.mb = 100
12/02/08 18:20:50 INFO mapred.MapTask: data buffer = 79691776/99614720
12/02/08 18:20:50 INFO mapred.MapTask: record buffer = 262144/327680
12/02/08 18:20:50 INFO mapred.MapTask: Starting flush of map output
12/02/08 18:20:50 INFO mapred.MapTask: Finished spill 0
12/02/08 18:20:50 INFO mapred.Task: Task:attempt_local_0012_m_000000_0 is done. And is in the process of commiting
12/02/08 18:20:51 INFO mapred.JobClient:  map 0% reduce 0%
12/02/08 18:20:53 INFO mapred.LocalJobRunner: 
12/02/08 18:20:53 INFO mapred.Task: Task 'attempt_local_0012_m_000000_0' done.
12/02/08 18:20:53 INFO mapred.LocalJobRunner: 
12/02/08 18:20:53 INFO mapred.Merger: Merging 1 sorted segments
12/02/08 18:20:53 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 132 bytes
12/02/08 18:20:53 INFO mapred.LocalJobRunner: 
12/02/08 18:20:53 INFO mapred.Task: Task:attempt_local_0012_r_000000_0 is done. And is in the process of commiting
12/02/08 18:20:53 INFO mapred.LocalJobRunner: 
12/02/08 18:20:53 INFO mapred.Task: Task attempt_local_0012_r_000000_0 is allowed to commit now
12/02/08 18:20:53 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0012_r_000000_0' to /tmp/mahout-work-hudson/reuters-dirichlet/clusters-12
12/02/08 18:20:53 INFO mapred.JobClient:  map 100% reduce 0%
12/02/08 18:20:56 INFO mapred.LocalJobRunner: reduce > reduce
12/02/08 18:20:56 INFO mapred.Task: Task 'attempt_local_0012_r_000000_0' done.
12/02/08 18:20:56 INFO mapred.JobClient:  map 100% reduce 100%
12/02/08 18:20:56 INFO mapred.JobClient: Job complete: job_local_0012
12/02/08 18:20:56 INFO mapred.JobClient: Counters: 16
12/02/08 18:20:56 INFO mapred.JobClient:   File Output Format Counters 
12/02/08 18:20:56 INFO mapred.JobClient:     Bytes Written=3090
12/02/08 18:20:56 INFO mapred.JobClient:   FileSystemCounters
12/02/08 18:20:56 INFO mapred.JobClient:     FILE_BYTES_READ=566494802
12/02/08 18:20:56 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=571660838
12/02/08 18:20:56 INFO mapred.JobClient:   File Input Format Counters 
12/02/08 18:20:56 INFO mapred.JobClient:     Bytes Read=102
12/02/08 18:20:56 INFO mapred.JobClient:   Map-Reduce Framework
12/02/08 18:20:56 INFO mapred.JobClient:     Reduce input groups=20
12/02/08 18:20:56 INFO mapred.JobClient:     Map output materialized bytes=136
12/02/08 18:20:56 INFO mapred.JobClient:     Combine output records=0
12/02/08 18:20:56 INFO mapred.JobClient:     Map input records=0
12/02/08 18:20:56 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/02/08 18:20:56 INFO mapred.JobClient:     Reduce output records=20
12/02/08 18:20:56 INFO mapred.JobClient:     Spilled Records=40
12/02/08 18:20:56 INFO mapred.JobClient:     Map output bytes=90
12/02/08 18:20:56 INFO mapred.JobClient:     Combine input records=0
12/02/08 18:20:56 INFO mapred.JobClient:     Map output records=20
12/02/08 18:20:56 INFO mapred.JobClient:     SPLIT_RAW_BYTES=156
12/02/08 18:20:56 INFO mapred.JobClient:     Reduce input records=20
12/02/08 18:20:56 INFO dirichlet.DirichletDriver: Iteration 13
12/02/08 18:20:56 INFO input.FileInputFormat: Total input paths to process : 1
12/02/08 18:20:56 INFO mapred.JobClient: Running job: job_local_0013
12/02/08 18:20:56 INFO mapred.MapTask: io.sort.mb = 100
12/02/08 18:20:57 INFO mapred.MapTask: data buffer = 79691776/99614720
12/02/08 18:20:57 INFO mapred.MapTask: record buffer = 262144/327680
12/02/08 18:20:57 INFO mapred.MapTask: Starting flush of map output
12/02/08 18:20:57 INFO mapred.MapTask: Finished spill 0
12/02/08 18:20:57 INFO mapred.Task: Task:attempt_local_0013_m_000000_0 is done. And is in the process of commiting
12/02/08 18:20:57 INFO mapred.JobClient:  map 0% reduce 0%
12/02/08 18:20:59 INFO mapred.LocalJobRunner: 
12/02/08 18:20:59 INFO mapred.Task: Task 'attempt_local_0013_m_000000_0' done.
12/02/08 18:20:59 INFO mapred.LocalJobRunner: 
12/02/08 18:20:59 INFO mapred.Merger: Merging 1 sorted segments
12/02/08 18:20:59 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 132 bytes
12/02/08 18:20:59 INFO mapred.LocalJobRunner: 
12/02/08 18:20:59 INFO mapred.JobClient:  map 100% reduce 0%
12/02/08 18:20:59 INFO mapred.Task: Task:attempt_local_0013_r_000000_0 is done. And is in the process of commiting
12/02/08 18:20:59 INFO mapred.LocalJobRunner: 
12/02/08 18:20:59 INFO mapred.Task: Task attempt_local_0013_r_000000_0 is allowed to commit now
12/02/08 18:20:59 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0013_r_000000_0' to /tmp/mahout-work-hudson/reuters-dirichlet/clusters-13
12/02/08 18:21:02 INFO mapred.LocalJobRunner: reduce > reduce
12/02/08 18:21:02 INFO mapred.Task: Task 'attempt_local_0013_r_000000_0' done.
12/02/08 18:21:02 INFO mapred.JobClient:  map 100% reduce 100%
12/02/08 18:21:02 INFO mapred.JobClient: Job complete: job_local_0013
12/02/08 18:21:02 INFO mapred.JobClient: Counters: 16
12/02/08 18:21:02 INFO mapred.JobClient:   File Output Format Counters 
12/02/08 18:21:02 INFO mapred.JobClient:     Bytes Written=3090
12/02/08 18:21:02 INFO mapred.JobClient:   FileSystemCounters
12/02/08 18:21:02 INFO mapred.JobClient:     FILE_BYTES_READ=613702214
12/02/08 18:21:02 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=619298632
12/02/08 18:21:02 INFO mapred.JobClient:   File Input Format Counters 
12/02/08 18:21:02 INFO mapred.JobClient:     Bytes Read=102
12/02/08 18:21:02 INFO mapred.JobClient:   Map-Reduce Framework
12/02/08 18:21:02 INFO mapred.JobClient:     Reduce input groups=20
12/02/08 18:21:02 INFO mapred.JobClient:     Map output materialized bytes=136
12/02/08 18:21:02 INFO mapred.JobClient:     Combine output records=0
12/02/08 18:21:02 INFO mapred.JobClient:     Map input records=0
12/02/08 18:21:02 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/02/08 18:21:02 INFO mapred.JobClient:     Reduce output records=20
12/02/08 18:21:02 INFO mapred.JobClient:     Spilled Records=40
12/02/08 18:21:02 INFO mapred.JobClient:     Map output bytes=90
12/02/08 18:21:02 INFO mapred.JobClient:     Combine input records=0
12/02/08 18:21:02 INFO mapred.JobClient:     Map output records=20
12/02/08 18:21:02 INFO mapred.JobClient:     SPLIT_RAW_BYTES=156
12/02/08 18:21:02 INFO mapred.JobClient:     Reduce input records=20
12/02/08 18:21:02 INFO dirichlet.DirichletDriver: Iteration 14
12/02/08 18:21:02 INFO input.FileInputFormat: Total input paths to process : 1
12/02/08 18:21:02 INFO mapred.JobClient: Running job: job_local_0014
12/02/08 18:21:02 WARN mapred.LocalJobRunner: job_local_0014
java.lang.ClassCastException: org.apache.hadoop.mapreduce.lib.input.FileSplit cannot be cast to org.apache.hadoop.mapred.InputSplit
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:412)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
12/02/08 18:21:03 INFO mapred.JobClient:  map 0% reduce 0%
12/02/08 18:21:03 INFO mapred.JobClient: Job complete: job_local_0014
12/02/08 18:21:03 INFO mapred.JobClient: Counters: 0
Exception in thread "main" java.lang.InterruptedException: Dirichlet Iteration failed processing /tmp/mahout-work-hudson/reuters-dirichlet/clusters-13
	at org.apache.mahout.clustering.dirichlet.DirichletDriver.runIteration(DirichletDriver.java:327)
	at org.apache.mahout.clustering.dirichlet.DirichletDriver.buildClustersMR(DirichletDriver.java:431)
	at org.apache.mahout.clustering.dirichlet.DirichletDriver.buildClusters(DirichletDriver.java:364)
	at org.apache.mahout.clustering.dirichlet.DirichletDriver.run(DirichletDriver.java:167)
	at org.apache.mahout.clustering.dirichlet.DirichletDriver.run(DirichletDriver.java:118)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
	at org.apache.mahout.clustering.dirichlet.DirichletDriver.main(DirichletDriver.java:68)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
	at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
	at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:188)
Build step 'Execute shell' marked build as failure
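
A note on the failure itself: the ClassCastException above (org.apache.hadoop.mapreduce.lib.input.FileSplit cannot be cast to org.apache.hadoop.mapred.InputSplit, thrown from MapTask.runOldMapper) is the usual signature of the old and new MapReduce APIs being mixed in a single job: the split comes from a new-API (org.apache.hadoop.mapreduce) input format, while the task runs down the old-API (org.apache.hadoop.mapred) mapper path. A job has to stay on one side of that divide. The sketch below shows a consistently new-API job setup for illustration only; the class and path names are placeholders, not the DirichletDriver code.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class NewApiJobSketch {

      // New-API mapper: extends org.apache.hadoop.mapreduce.Mapper,
      // not the old org.apache.hadoop.mapred.Mapper interface.
      public static class PassThroughMapper
          extends Mapper<Object, Text, Text, IntWritable> {
        @Override
        protected void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
          context.write(value, new IntWritable(1));
        }
      }

      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "new-api-sketch");
        job.setJarByClass(NewApiJobSketch.class);

        // Every class wired into the job comes from org.apache.hadoop.mapreduce.*;
        // mixing in org.apache.hadoop.mapred.* classes is what produces the
        // FileSplit-to-InputSplit ClassCastException seen in the log.
        job.setMapperClass(PassThroughMapper.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }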


Jenkins build is back to normal : Mahout-Examples-Cluster-Reuters-II #37

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters-II/37/changes>