Posted to dev@mahout.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2013/01/18 09:15:47 UTC
Build failed in Jenkins: Mahout-Examples-Classify-20News #129
See <https://builds.apache.org/job/Mahout-Examples-Classify-20News/129/changes>
Changes:
[ssc] MAHOUT-1062 alphaI is not correctly saved in NaiveBayesModel
[ssc] MAHOUT-1077 apparent spectral kmeans bug
------------------------------------------
[...truncated 6736 lines...]
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] [source:jar-no-fork {execution: attach-sources}]
[INFO] Building jar: <https://builds.apache.org/job/Mahout-Examples-Classify-20News/ws/trunk/examples/target/mahout-examples-0.8-SNAPSHOT-sources.jar>
[INFO] [install:install {execution: default-install}]
[INFO] Installing <https://builds.apache.org/job/Mahout-Examples-Classify-20News/ws/trunk/examples/target/mahout-examples-0.8-SNAPSHOT.jar> to /home/hudson/.m2/repository/org/apache/mahout/mahout-examples/0.8-SNAPSHOT/mahout-examples-0.8-SNAPSHOT.jar
[INFO] Installing <https://builds.apache.org/job/Mahout-Examples-Classify-20News/ws/trunk/examples/target/mahout-examples-0.8-SNAPSHOT-job.jar> to /home/hudson/.m2/repository/org/apache/mahout/mahout-examples/0.8-SNAPSHOT/mahout-examples-0.8-SNAPSHOT-job.jar
[INFO] Installing <https://builds.apache.org/job/Mahout-Examples-Classify-20News/ws/trunk/examples/target/mahout-examples-0.8-SNAPSHOT-sources.jar> to /home/hudson/.m2/repository/org/apache/mahout/mahout-examples/0.8-SNAPSHOT/mahout-examples-0.8-SNAPSHOT-sources.jar
[INFO] ------------------------------------------------------------------------
[INFO] Building Mahout Release Package
[INFO] task-segment: [clean, install]
[INFO] ------------------------------------------------------------------------
[INFO] [clean:clean {execution: default-clean}]
[INFO] [site:attach-descriptor {execution: default-attach-descriptor}]
[INFO] [assembly:single {execution: bin-assembly}]
[INFO] Assemblies have been skipped per configuration of the skipAssembly parameter.
[INFO] [assembly:single {execution: src-assembly}]
[INFO] Assemblies have been skipped per configuration of the skipAssembly parameter.
[INFO] [install:install {execution: default-install}]
[INFO] Installing <https://builds.apache.org/job/Mahout-Examples-Classify-20News/ws/trunk/distribution/pom.xml> to /home/hudson/.m2/repository/org/apache/mahout/mahout-distribution/0.8-SNAPSHOT/mahout-distribution-0.8-SNAPSHOT.pom
[INFO]
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] ------------------------------------------------------------------------
[INFO] Apache Mahout ......................................... SUCCESS [5.924s]
[INFO] Mahout Build Tools .................................... SUCCESS [1.898s]
[INFO] Mahout Math ........................................... SUCCESS [15.017s]
[INFO] Mahout Core ........................................... SUCCESS [18.719s]
[INFO] Mahout Integration .................................... SUCCESS [16.864s]
[INFO] Mahout Examples ....................................... SUCCESS [24.550s]
[INFO] Mahout Release Package ................................ SUCCESS [0.035s]
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1 minute 23 seconds
[INFO] Finished at: Fri Jan 18 08:13:26 UTC 2013
[INFO] Final Memory: 85M/421M
[INFO] ------------------------------------------------------------------------
[Mahout-Examples-Classify-20News] $ /bin/bash -xe /tmp/hudson6979638664122356868.sh
+ cd trunk
+ echo 3
+ ./examples/bin/classify-20newsgroups.sh
Please select a number to choose the corresponding task to run
1. cnaivebayes
2. naivebayes
3. sgd
4. clean -- cleans up the work area in /tmp/mahout-work-hudson
ok. You chose 3 and we'll use sgd
creating work directory at /tmp/mahout-work-hudson
Testing on /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/ with model: /tmp/news-group.model
hadoop binary is not in PATH,HADOOP_HOME/bin,HADOOP_PREFIX/bin, running locally
13/01/18 08:13:28 WARN driver.MahoutDriver: No org.apache.mahout.classifier.sgd.TestNewsGroups.props found on classpath, will use command-line arguments only
7532 test files
=======================================================
Summary
-------------------------------------------------------
Correctly Classified Instances : 5497 72.9819%
Incorrectly Classified Instances : 2035 27.0181%
Total Classified Instances : 7532
=======================================================
Confusion Matrix
-------------------------------------------------------
a b c d e f g h i j k l m n o p q r s t <--Classified as
37 3 4 3 49 8 41 8 76 29 1 7 9 12 4 3 5 7 77 6 | 389 a = comp.graphics
0 260 5 3 9 6 1 5 0 1 3 12 2 9 6 5 18 16 2 1 | 364 b = talk.politics.guns
0 0 345 1 7 1 2 1 4 3 2 0 20 0 2 0 4 2 1 2 | 397 c = rec.sport.baseball
0 4 3 315 24 3 9 2 3 2 12 1 0 1 1 0 3 0 3 10 | 396 d = rec.autos
1 0 1 8 280 3 23 4 5 7 8 14 0 3 3 1 1 0 27 4 | 393 e = sci.electronics
1 0 1 0 6 326 2 2 2 1 0 1 1 5 10 1 1 33 5 0 | 398 f = soc.religion.christian
0 0 2 1 21 0 303 0 3 7 2 2 1 3 0 0 0 2 32 6 | 385 g = comp.sys.mac.hardware
0 0 0 2 13 9 4 341 4 2 0 0 1 11 1 0 3 1 2 0 | 394 h = sci.space
5 1 1 5 6 6 8 4 297 28 2 2 4 1 0 0 1 2 14 8 | 395 i = comp.windows.x
4 0 1 1 10 2 24 2 30 247 4 3 1 2 8 0 3 3 49 0 | 394 j = comp.os.ms-windows.misc
0 1 4 16 2 1 1 1 1 1 356 0 0 5 1 0 1 0 4 3 | 398 k = rec.motorcycles
1 4 3 0 8 6 7 4 5 3 0 340 2 2 2 0 6 2 1 0 | 396 l = sci.crypt
0 0 25 0 2 3 2 1 0 3 2 0 354 0 1 0 1 2 2 1 | 399 m = rec.sport.hockey
1 3 1 6 27 16 6 3 7 1 7 1 6 285 7 0 1 3 10 5 | 396 n = sci.med
0 1 2 1 4 31 0 5 0 0 0 5 2 10 213 2 5 35 3 0 | 319 o = alt.atheism
0 2 5 4 2 8 1 1 3 0 2 2 1 0 19 310 12 3 1 0 | 376 p = talk.politics.mideast
0 88 0 0 1 4 1 6 1 0 2 4 0 5 6 4 176 9 2 1 | 310 q = talk.politics.misc
1 14 1 3 3 27 3 10 2 1 0 0 1 6 35 3 8 131 2 0 | 251 r = talk.religion.misc
2 0 0 6 28 0 36 1 9 31 1 1 1 1 0 0 1 0 270 4 | 392 s = comp.sys.ibm.pc.hardware
1 0 2 7 14 1 15 1 3 1 2 0 4 2 0 0 0 1 25 311 | 390 t = misc.forsale
Avg. Log-likelihood: -1.1294048800549976 25%-ile: -1.6585551628361608 75%-ile: -0.5713756311044645
13/01/18 08:14:30 INFO driver.MahoutDriver: Program took 62590 ms (Minutes: 1.0431666666666666)
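A quick way to sanity-check the SGD summary above: the reported percentages follow directly from the raw counts, and each confusion-matrix row sums to the class total printed after the `|`. A minimal sketch, with the counts copied from the log and the row for `b = talk.politics.guns` used as the example:

```python
# Sanity-check the SGD summary and one confusion-matrix row from the log.
correct, incorrect = 5497, 2035
total = correct + incorrect                        # 7532 test files
print(f"Accuracy: {100 * correct / total:.4f}%")   # 72.9819%, as reported
print(f"Error:    {100 * incorrect / total:.4f}%") # 27.0181%, as reported

# Row "b = talk.politics.guns": the diagonal entry (column b) counts the
# correctly classified posts for that class; the row sums to the class total.
row_b = [0, 260, 5, 3, 9, 6, 1, 5, 0, 1, 3, 12, 2, 9, 6, 5, 18, 16, 2, 1]
assert sum(row_b) == 364                           # matches "| 364" in the log
print(f"Per-class recall for b: {100 * row_b[1] / sum(row_b):.2f}%")
```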
+ echo 2
+ ./examples/bin/classify-20newsgroups.sh
Please select a number to choose the corresponding task to run
1. cnaivebayes
2. naivebayes
3. sgd
4. clean -- cleans up the work area in /tmp/mahout-work-hudson
ok. You chose 2 and we'll use naivebayes
creating work directory at /tmp/mahout-work-hudson
+ echo 'Preparing 20newsgroups data'
Preparing 20newsgroups data
+ rm -rf /tmp/mahout-work-hudson/20news-all
+ mkdir /tmp/mahout-work-hudson/20news-all
+ cp -R /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/alt.atheism /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/comp.graphics /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/comp.os.ms-windows.misc /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/comp.sys.ibm.pc.hardware /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/comp.sys.mac.hardware /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/comp.windows.x /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/misc.forsale /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/rec.autos /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/rec.motorcycles /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/rec.sport.baseball /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/rec.sport.hockey /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/sci.crypt /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/sci.electronics /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/sci.med /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/sci.space /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/soc.religion.christian /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/talk.politics.guns /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/talk.politics.mideast /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/talk.politics.misc /tmp/mahout-work-hudson/20news-bydate/20news-bydate-test/talk.religion.misc /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/alt.atheism /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/comp.graphics /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/comp.os.ms-windows.misc /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/comp.sys.ibm.pc.hardware /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/comp.sys.mac.hardware /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/comp.windows.x /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/misc.forsale /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/rec.autos /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/rec.motorcycles /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/rec.sport.baseball /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/rec.sport.hockey /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/sci.crypt /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/sci.electronics /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/sci.med /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/sci.space /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/soc.religion.christian /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/talk.politics.guns /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/talk.politics.mideast /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/talk.politics.misc /tmp/mahout-work-hudson/20news-bydate/20news-bydate-train/talk.religion.misc /tmp/mahout-work-hudson/20news-all
+ echo 'Creating sequence files from 20newsgroups data'
Creating sequence files from 20newsgroups data
+ ./bin/mahout seqdirectory -i /tmp/mahout-work-hudson/20news-all -o /tmp/mahout-work-hudson/20news-seq
hadoop binary is not in PATH,HADOOP_HOME/bin,HADOOP_PREFIX/bin, running locally
13/01/18 08:15:24 INFO common.AbstractJob: Command line arguments: {--charset=[UTF-8], --chunkSize=[64], --endPhase=[2147483647], --fileFilterClass=[org.apache.mahout.text.PrefixAdditionFilter], --input=[/tmp/mahout-work-hudson/20news-all], --keyPrefix=[], --output=[/tmp/mahout-work-hudson/20news-seq], --startPhase=[0], --tempDir=[temp]}
13/01/18 08:15:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/01/18 08:15:29 INFO driver.MahoutDriver: Program took 5238 ms (Minutes: 0.0873)
+ echo 'Converting sequence files to vectors'
Converting sequence files to vectors
+ ./bin/mahout seq2sparse -i /tmp/mahout-work-hudson/20news-seq -o /tmp/mahout-work-hudson/20news-vectors -lnorm -nv -wt tfidf
hadoop binary is not in PATH,HADOOP_HOME/bin,HADOOP_PREFIX/bin, running locally
13/01/18 08:15:30 INFO vectorizer.SparseVectorsFromSequenceFiles: Maximum n-gram size is: 1
13/01/18 08:15:30 INFO vectorizer.SparseVectorsFromSequenceFiles: Minimum LLR value: 1.0
13/01/18 08:15:30 INFO vectorizer.SparseVectorsFromSequenceFiles: Number of reduce tasks: 1
13/01/18 08:15:30 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/20news-vectors/tokenized-documents
13/01/18 08:15:30 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/01/18 08:15:31 INFO input.FileInputFormat: Total input paths to process : 1
13/01/18 08:15:31 INFO mapred.JobClient: Running job: job_local_0001
13/01/18 08:15:31 INFO util.ProcessTree: setsid exited with exit code 0
13/01/18 08:15:31 INFO mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1f4e571
13/01/18 08:15:32 INFO mapred.JobClient: map 0% reduce 0%
13/01/18 08:15:35 INFO mapred.Task: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
13/01/18 08:15:35 INFO mapred.LocalJobRunner:
13/01/18 08:15:35 INFO mapred.Task: Task attempt_local_0001_m_000000_0 is allowed to commit now
13/01/18 08:15:35 INFO output.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to /tmp/mahout-work-hudson/20news-vectors/tokenized-documents
13/01/18 08:15:37 INFO mapred.LocalJobRunner:
13/01/18 08:15:37 INFO mapred.LocalJobRunner:
13/01/18 08:15:37 INFO mapred.Task: Task 'attempt_local_0001_m_000000_0' done.
13/01/18 08:15:38 INFO mapred.JobClient: map 100% reduce 0%
13/01/18 08:15:38 INFO mapred.JobClient: Job complete: job_local_0001
13/01/18 08:15:38 INFO mapred.JobClient: Counters: 12
13/01/18 08:15:38 INFO mapred.JobClient: File Output Format Counters
13/01/18 08:15:38 INFO mapred.JobClient: Bytes Written=27717536
13/01/18 08:15:38 INFO mapred.JobClient: File Input Format Counters
13/01/18 08:15:38 INFO mapred.JobClient: Bytes Read=36979845
13/01/18 08:15:38 INFO mapred.JobClient: FileSystemCounters
13/01/18 08:15:38 INFO mapred.JobClient: FILE_BYTES_READ=98546329
13/01/18 08:15:38 INFO mapred.JobClient: FILE_BYTES_WRITTEN=89797879
13/01/18 08:15:38 INFO mapred.JobClient: Map-Reduce Framework
13/01/18 08:15:38 INFO mapred.JobClient: Map input records=18846
13/01/18 08:15:38 INFO mapred.JobClient: Physical memory (bytes) snapshot=0
13/01/18 08:15:38 INFO mapred.JobClient: Spilled Records=0
13/01/18 08:15:38 INFO mapred.JobClient: Total committed heap usage (bytes)=124256256
13/01/18 08:15:38 INFO mapred.JobClient: CPU time spent (ms)=0
13/01/18 08:15:38 INFO mapred.JobClient: Virtual memory (bytes) snapshot=0
13/01/18 08:15:38 INFO mapred.JobClient: SPLIT_RAW_BYTES=112
13/01/18 08:15:38 INFO mapred.JobClient: Map output records=18846
13/01/18 08:15:38 INFO common.HadoopUtil: Deleting /tmp/mahout-work-hudson/20news-vectors/wordcount
13/01/18 08:15:39 INFO input.FileInputFormat: Total input paths to process : 1
13/01/18 08:15:39 INFO mapred.JobClient: Running job: job_local_0002
13/01/18 08:15:39 INFO mapred.Task: Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1cbda0a
13/01/18 08:15:39 INFO mapred.MapTask: io.sort.mb = 100
13/01/18 08:15:39 INFO mapred.MapTask: data buffer = 79691776/99614720
13/01/18 08:15:39 INFO mapred.MapTask: record buffer = 262144/327680
13/01/18 08:15:39 INFO mapred.MapTask: Spilling map output: record full = true
13/01/18 08:15:39 INFO mapred.MapTask: bufstart = 0; bufend = 3911616; bufvoid = 99614720
13/01/18 08:15:39 INFO mapred.MapTask: kvstart = 0; kvend = 262144; length = 327680
13/01/18 08:15:40 INFO mapred.JobClient: map 0% reduce 0%
13/01/18 08:15:40 INFO mapred.MapTask: Finished spill 0
13/01/18 08:15:40 INFO mapred.MapTask: Spilling map output: record full = true
13/01/18 08:15:40 INFO mapred.MapTask: bufstart = 3911616; bufend = 7735866; bufvoid = 99614720
13/01/18 08:15:40 INFO mapred.MapTask: kvstart = 262144; kvend = 196607; length = 327680
13/01/18 08:15:41 INFO mapred.MapTask: Finished spill 1
13/01/18 08:15:41 INFO mapred.MapTask: Spilling map output: record full = true
13/01/18 08:15:41 INFO mapred.MapTask: bufstart = 7735866; bufend = 11637101; bufvoid = 99614720
13/01/18 08:15:41 INFO mapred.MapTask: kvstart = 196607; kvend = 131070; length = 327680
13/01/18 08:15:41 INFO mapred.MapTask: Finished spill 2
13/01/18 08:15:42 INFO mapred.MapTask: Spilling map output: record full = true
13/01/18 08:15:42 INFO mapred.MapTask: bufstart = 11637101; bufend = 15530549; bufvoid = 99614720
13/01/18 08:15:42 INFO mapred.MapTask: kvstart = 131070; kvend = 65533; length = 327680
13/01/18 08:15:42 INFO mapred.MapTask: Finished spill 3
13/01/18 08:15:42 INFO mapred.MapTask: Spilling map output: record full = true
13/01/18 08:15:42 INFO mapred.MapTask: bufstart = 15530549; bufend = 19333514; bufvoid = 99614720
13/01/18 08:15:42 INFO mapred.MapTask: kvstart = 65533; kvend = 327677; length = 327680
13/01/18 08:15:43 INFO mapred.MapTask: Finished spill 4
13/01/18 08:15:43 INFO mapred.MapTask: Spilling map output: record full = true
13/01/18 08:15:43 INFO mapred.MapTask: bufstart = 19333514; bufend = 23189462; bufvoid = 99614720
13/01/18 08:15:43 INFO mapred.MapTask: kvstart = 327677; kvend = 262140; length = 327680
13/01/18 08:15:43 INFO mapred.MapTask: Finished spill 5
13/01/18 08:15:44 INFO mapred.MapTask: Spilling map output: record full = true
13/01/18 08:15:44 INFO mapred.MapTask: bufstart = 23189462; bufend = 27070371; bufvoid = 99614720
13/01/18 08:15:44 INFO mapred.MapTask: kvstart = 262140; kvend = 196603; length = 327680
13/01/18 08:15:44 INFO mapred.MapTask: Finished spill 6
13/01/18 08:15:44 INFO mapred.MapTask: Spilling map output: record full = true
13/01/18 08:15:44 INFO mapred.MapTask: bufstart = 27070371; bufend = 31027903; bufvoid = 99614720
13/01/18 08:15:44 INFO mapred.MapTask: kvstart = 196603; kvend = 131066; length = 327680
13/01/18 08:15:45 INFO mapred.MapTask: Finished spill 7
13/01/18 08:15:45 INFO mapred.LocalJobRunner:
13/01/18 08:15:45 INFO mapred.MapTask: Spilling map output: record full = true
13/01/18 08:15:45 INFO mapred.MapTask: bufstart = 31027903; bufend = 34986850; bufvoid = 99614720
13/01/18 08:15:45 INFO mapred.MapTask: kvstart = 131066; kvend = 65529; length = 327680
13/01/18 08:15:45 INFO mapred.MapTask: Finished spill 8
13/01/18 08:15:46 INFO mapred.MapTask: Spilling map output: record full = true
13/01/18 08:15:46 INFO mapred.MapTask: bufstart = 34986850; bufend = 38843088; bufvoid = 99614720
13/01/18 08:15:46 INFO mapred.MapTask: kvstart = 65529; kvend = 327673; length = 327680
13/01/18 08:15:46 INFO mapred.MapTask: Starting flush of map output
13/01/18 08:15:46 INFO mapred.JobClient: map 82% reduce 0%
13/01/18 08:15:46 INFO mapred.MapTask: Finished spill 9
13/01/18 08:15:46 INFO mapred.MapTask: Finished spill 10
13/01/18 08:15:46 WARN mapred.LocalJobRunner: job_local_0002
org.apache.hadoop.util.DiskChecker$DiskErrorException: Could not find output/spill0.out in any of the configured local directories
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathToRead(LocalDirAllocator.java:429)
at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathToRead(LocalDirAllocator.java:160)
at org.apache.hadoop.mapred.MapOutputFile.getSpillFile(MapOutputFile.java:107)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.mergeParts(MapTask.java:1614)
at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1323)
at org.apache.hadoop.mapred.MapTask$NewOutputCollector.close(MapTask.java:699)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)
13/01/18 08:15:47 INFO mapred.JobClient: Job complete: job_local_0002
13/01/18 08:15:47 INFO mapred.JobClient: Counters: 15
13/01/18 08:15:47 INFO mapred.JobClient: File Input Format Counters
13/01/18 08:15:47 INFO mapred.JobClient: Bytes Read=22745088
13/01/18 08:15:47 INFO mapred.JobClient: FileSystemCounters
13/01/18 08:15:47 INFO mapred.JobClient: FILE_BYTES_READ=182853834
13/01/18 08:15:47 INFO mapred.JobClient: FILE_BYTES_WRITTEN=157258391
13/01/18 08:15:47 INFO mapred.JobClient: Map-Reduce Framework
13/01/18 08:15:47 INFO mapred.JobClient: Map output materialized bytes=0
13/01/18 08:15:47 INFO mapred.JobClient: Combine output records=297888
13/01/18 08:15:47 INFO mapred.JobClient: Map input records=15325
13/01/18 08:15:47 INFO mapred.JobClient: Physical memory (bytes) snapshot=0
13/01/18 08:15:47 INFO mapred.JobClient: Spilled Records=297888
13/01/18 08:15:47 INFO mapred.JobClient: Map output bytes=32409862
13/01/18 08:15:47 INFO mapred.JobClient: Total committed heap usage (bytes)=285605888
13/01/18 08:15:47 INFO mapred.JobClient: CPU time spent (ms)=0
13/01/18 08:15:47 INFO mapred.JobClient: Virtual memory (bytes) snapshot=0
13/01/18 08:15:47 INFO mapred.JobClient: SPLIT_RAW_BYTES=141
13/01/18 08:15:47 INFO mapred.JobClient: Map output records=2188428
13/01/18 08:15:47 INFO mapred.JobClient: Combine input records=2097146
Exception in thread "main" java.lang.IllegalStateException: Job failed!
at org.apache.mahout.vectorizer.DictionaryVectorizer.startWordCounting(DictionaryVectorizer.java:360)
at org.apache.mahout.vectorizer.DictionaryVectorizer.createTermFrequencyVectors(DictionaryVectorizer.java:171)
at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.run(SparseVectorsFromSequenceFiles.java:272)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.main(SparseVectorsFromSequenceFiles.java:55)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
Build step 'Execute shell' marked build as failure
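For context on the failure: per the stack trace, `LocalDirAllocator.getLocalPathToRead` throws `DiskChecker$DiskErrorException` when a map task's spill file cannot be found under any configured local directory. Here `output/spill0.out` had been written to the local work area under /tmp but was gone by the time the merge phase tried to read it; one plausible cause on a shared build slave is concurrent cleanup of /tmp. An illustrative Python sketch of that lookup logic (not Hadoop's actual code; the function name merely mirrors the Java method in the trace):

```python
# Illustrative sketch of LocalDirAllocator-style path resolution: search each
# configured local dir for a relative path, fail if none contains it.
import os

def get_local_path_to_read(rel_path, local_dirs):
    """Return the first existing copy of rel_path under local_dirs."""
    for d in local_dirs:
        candidate = os.path.join(d, rel_path)
        if os.path.exists(candidate):
            return candidate
    raise IOError(
        f"Could not find {rel_path} in any of the configured local directories")

# The failing lookup from the log: spill0.out was written during the map
# phase but is missing when the merge phase asks for it.
try:
    get_local_path_to_read("output/spill0.out", ["/tmp/missing-mapred-local"])
except IOError as e:
    print(e)
```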
Jenkins build is back to normal : Mahout-Examples-Classify-20News #130
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mahout-Examples-Classify-20News/130/changes>