Posted to user@mahout.apache.org by Pat Ferrel <pa...@occamsmachete.com> on 2015/04/14 20:26:42 UTC

Error in AtA

Running the cooccurrence calc on Spark 1.1.1 / Hadoop 2.4 with YARN.

This isn’t a problem in our code, is it? I’ve never run into a TaskResultLost, and I’m not sure what can cause it.

TaskResultLost (result lost from block manager)

collect at AtA.scala:121    97/213 (25 failed)

org.apache.spark.rdd.RDD.collect(RDD.scala:774)
org.apache.mahout.sparkbindings.blas.AtA$.at_a_slim(AtA.scala:121)
org.apache.mahout.sparkbindings.blas.AtA$.at_a(AtA.scala:50)
org.apache.mahout.sparkbindings.SparkEngine$.tr2phys(SparkEngine.scala:231)
org.apache.mahout.sparkbindings.SparkEngine$.tr2phys(SparkEngine.scala:242)
org.apache.mahout.sparkbindings.SparkEngine$.toPhysical(SparkEngine.scala:108)
org.apache.mahout.math.drm.logical.CheckpointAction.checkpoint(CheckpointAction.scala:40)
org.apache.mahout.math.drm.package$.drm2Checkpointed(package.scala:90)
org.apache.mahout.math.cf.SimilarityAnalysis$$anonfun$3.apply(SimilarityAnalysis.scala:129)
org.apache.mahout.math.cf.SimilarityAnalysis$$anonfun$3.apply(SimilarityAnalysis.scala:127)
scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
scala.collection.Iterator$class.foreach(Iterator.scala:727)
scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:176)
scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:45)
scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
scala.collection.AbstractIterator.to(Iterator.scala:1157)
scala.collection.TraversableOnce$class.toList(TraversableOnce.scala:257)
scala.collection.AbstractIterator.toList(Iterator.scala:1157)
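For context, the failing collect at AtA.scala:121 is Mahout's "slim" A'A path, which computes the Gramian A'A of a tall-and-skinny matrix as a sum of per-row outer products and collects the summed result back to the driver, so the full n x n product has to survive the trip through the executors' block managers. A minimal plain-Scala sketch of that computation (no Spark or Mahout dependencies; the names here are illustrative, not Mahout's actual API):

```scala
// Sketch of the "slim" A'A computation: each row a_i of A contributes an
// outer product a_i * a_i^T, and the partial sums combine into a dense
// n x n Gramian that must fit on (and be shipped to) the driver.
object SlimAtA {
  type Matrix = Array[Array[Double]]

  // Sum of outer products of the rows of A: (A'A)(j, k) = sum_i A(i,j) * A(i,k)
  def ata(rows: Seq[Array[Double]], n: Int): Matrix = {
    val acc = Array.fill(n, n)(0.0)
    for (row <- rows; j <- 0 until n; k <- 0 until n)
      acc(j)(k) += row(j) * row(k)
    acc
  }

  def main(args: Array[String]): Unit = {
    // A = [[1, 2], [3, 4]]  =>  A'A = [[10, 14], [14, 20]]
    val a = Seq(Array(1.0, 2.0), Array(3.0, 4.0))
    println(ata(a, 2).map(_.mkString(" ")).mkString("\n"))
  }
}
```

In the distributed version the per-partition partial Gramians are reduced and then collected, which is why a lost executor shows up here as a failed collect rather than in the map stage.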

Re: Error in AtA

Posted by Dmitriy Lyubimov <dl...@gmail.com>.
You need to check the task logs (on YARN, the logs of the executors that ran the failed tasks).

On Tue, Apr 14, 2015 at 11:26 AM, Pat Ferrel <pa...@occamsmachete.com> wrote:

> Running cooccurrence calc on Spark 1.1.1 Hadoop 2.4 with Yarn
>
> This isn’t a problem in our code is it? I’ve never run into a
> TaskResultLost, not sure what can cause that.
>
> TaskResultLost (result lost from block manager)
>

TaskResultLost

Posted by Pat Ferrel <pa...@occamsmachete.com>.
Running on Spark 1.1.1 / Hadoop 2.4 with YARN, on a dedicated AWS cluster (non-EMR).

Is this in our code or our config? I’ve never run into a TaskResultLost, and I’m not sure what can cause it.


TaskResultLost (result lost from block manager)

collect at AtA.scala:121    97/213 (25 failed)

org.apache.spark.rdd.RDD.collect(RDD.scala:774)
org.apache.mahout.sparkbindings.blas.AtA$.at_a_slim(AtA.scala:121)
org.apache.mahout.sparkbindings.blas.AtA$.at_a(AtA.scala:50)
org.apache.mahout.sparkbindings.SparkEngine$.tr2phys(SparkEngine.scala:231)
org.apache.mahout.sparkbindings.SparkEngine$.tr2phys(SparkEngine.scala:242)
org.apache.mahout.sparkbindings.SparkEngine$.toPhysical(SparkEngine.scala:108)
org.apache.mahout.math.drm.logical.CheckpointAction.checkpoint(CheckpointAction.scala:40)
org.apache.mahout.math.drm.package$.drm2Checkpointed(package.scala:90)
org.apache.mahout.math.cf.SimilarityAnalysis$$anonfun$3.apply(SimilarityAnalysis.scala:129)
org.apache.mahout.math.cf.SimilarityAnalysis$$anonfun$3.apply(SimilarityAnalysis.scala:127)
scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
scala.collection.Iterator$class.foreach(Iterator.scala:727)
scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:176)
scala.collection.mutable.ListBuffer.$plus$plus$eq(ListBuffer.scala:45)
scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
scala.collection.AbstractIterator.to(Iterator.scala:1157)
scala.collection.TraversableOnce$class.toList(TraversableOnce.scala:257)
scala.collection.AbstractIterator.toList(Iterator.scala:1157)
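TaskResultLost ("result lost from block manager") means the driver went to fetch a completed task's result block from an executor's block manager and the block was gone, most often because the executor had died or been killed in the meantime. On YARN a common cause is a container exceeding its memory limit and being killed by the NodeManager, so after checking the task and executor logs, a usual first step is to give executors more headroom. An illustrative spark-defaults.conf fragment for a Spark 1.x-on-YARN setup (the values are examples, not tuned recommendations):

```
spark.executor.memory              4g
# Off-heap headroom for YARN containers (in MB); too little gets containers
# killed by the NodeManager, taking their task-result blocks with them.
spark.yarn.executor.memoryOverhead 1024
```

Whether this applies here depends on what the task logs actually show; if they report a plain executor crash (OOM, disk failure), the fix lies there rather than in configuration.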


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org