Posted to user@spark.apache.org by bliang <bl...@thecarousell.com> on 2015/07/13 12:55:35 UTC

MovieALS Implicit Error

Hi,

I am trying to run the MovieALS example with an implicit dataset and am
receiving this error:

Got 3856988 ratings from 144250 users on 378937 movies.
Training: 3085522, test: 771466.
15/07/13 10:43:07 WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
15/07/13 10:43:07 WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
15/07/13 10:43:10 WARN TaskSetManager: Lost task 3.0 in stage 29.0 (TID 192, 10.162.45.33): java.lang.AssertionError: assertion failed: lapack.dppsv returned 1.
    at scala.Predef$.assert(Predef.scala:179)
    at org.apache.spark.ml.recommendation.ALS$CholeskySolver.solve(ALS.scala:386)
    at org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1163)
    at org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1124)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:277)
    at org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:171)
    at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:78)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:242)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
    at org.apache.spark.scheduler.Task.run(Task.scala:70)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

15/07/13 10:43:10 ERROR TaskSetManager: Task 12 in stage 29.0 failed 4 times; aborting job
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 12 in stage 29.0 failed 4 times, most recent failure: Lost task 12.3 in stage 29.0 (TID 249, 10.162.45.33): java.lang.AssertionError: assertion failed: lapack.dppsv returned 1.
    at scala.Predef$.assert(Predef.scala:179)
    at org.apache.spark.ml.recommendation.ALS$CholeskySolver.solve(ALS.scala:386)
    at org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1163)
    at org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1124)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:277)
    at org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:171)
    at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:78)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:242)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
    at org.apache.spark.scheduler.Task.run(Task.scala:70)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1266)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1257)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1256)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1256)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1450)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1411)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)

Would it be possible to help me out?

Thank you,
Ben



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/MovieALS-Implicit-Error-tp23793.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: MovieALS Implicit Error

Posted by Xiangrui Meng <me...@gmail.com>.
Hi Benedict,

Did you set lambda to zero?

-Xiangrui
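
For reference, lambda is ALS's regularization parameter, and a zero value is
the classic way to let the per-user and per-item normal equations go singular.
Below is a minimal sketch of how lambda enters implicit ALS through the
spark.mllib entry point, which delegates to the ml.recommendation.ALS
implementation seen in the stack trace (the input path, parsing, and parameter
values are illustrative assumptions, not taken from this thread):

import org.apache.spark.SparkContext
import org.apache.spark.mllib.recommendation.{ALS, Rating}

object ImplicitAlsSketch {
  def run(sc: SparkContext): Unit = {
    // Hypothetical input path and format: "user,item,count" per line.
    val ratings = sc.textFile("hdfs:///data/ratings.csv").map { line =>
      val Array(user, item, count) = line.split(',')
      // For implicit feedback the third field is an interaction strength,
      // not a 1-5 star rating.
      Rating(user.toInt, item.toInt, count.toDouble)
    }

    val rank = 10      // number of latent factors
    val numIters = 10
    val lambda = 0.01  // keep strictly positive: the lambda * I term is what
                       // keeps each least-squares subproblem positive definite
    val alpha = 1.0    // confidence scaling for implicit feedback

    val model = ALS.trainImplicit(ratings, rank, numIters, lambda, alpha)
    model.recommendProducts(42, 5).foreach(println) // top 5 items for user 42
  }
}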

On Mon, Jul 13, 2015 at 4:18 AM, Benedict Liang <bl...@thecarousell.com> wrote:
> Hi Sean,
>
> This user dataset is organic. What do you think would be a good ratings
> threshold, then? I am only encountering this with the implicit type; the
> explicit type works fine (though it is not suitable for this dataset).
>
> Thank you,
> Benedict
>
> On Mon, Jul 13, 2015 at 7:15 PM, Sean Owen <so...@cloudera.com> wrote:
>>
>> Is the data set synthetic, does it have very few items, or is it indeed
>> very sparse? Those could be reasons. However, this kind of thing usually
>> happens with very small data sets. I could be wrong about what's going
>> on, but it's a decent guess at the immediate cause given the error
>> messages.
>>
>> On Mon, Jul 13, 2015 at 12:12 PM, Benedict Liang
>> <bl...@thecarousell.com> wrote:
>> > Hi Sean,
>> >
>> > Thank you for your quick response. By very little data, do you mean that
>> > the matrix is too sparse? Or are there too few data points? There are
>> > 3856988 ratings in my dataset currently.
>> >
>> > Regards,
>> > Benedict
>> >
>
>



Re: MovieALS Implicit Error

Posted by Benedict Liang <bl...@thecarousell.com>.
Hi Sean,

This user dataset is organic. What do you think would be a good ratings
threshold, then? I am only encountering this with the implicit type; the
explicit type works fine (though it is not suitable for this dataset).

Thank you,
Benedict
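
One concrete reading of "ratings threshold" is to drop users and items with
too few interactions before training, since those rows are what tend to
produce the near-singular subproblems. A sketch under assumed names (the
helper and the cutoff of 5 are illustrative, not from this thread):

import org.apache.spark.rdd.RDD
import org.apache.spark.mllib.recommendation.Rating

// Keep only ratings whose user and item both have at least minCount
// interactions. The cutoff is a knob to tune, not a recommendation.
def filterSparse(ratings: RDD[Rating], minCount: Int = 5): RDD[Rating] = {
  val activeUsers = ratings.map(r => (r.user, 1L)).reduceByKey(_ + _)
    .filter(_._2 >= minCount).keys.collect().toSet
  val activeItems = ratings.map(r => (r.product, 1L)).reduceByKey(_ + _)
    .filter(_._2 >= minCount).keys.collect().toSet
  ratings.filter(r => activeUsers(r.user) && activeItems(r.product))
}

One pass may not be enough: removing sparse items can push some users back
under the cutoff, so in practice the filter is applied repeatedly until it
reaches a fixed point. Collecting the key sets to the driver is fine at this
scale (144250 users, 378937 items) but would need a join for much larger ID
spaces.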

On Mon, Jul 13, 2015 at 7:15 PM, Sean Owen <so...@cloudera.com> wrote:

> Is the data set synthetic, does it have very few items, or is it indeed
> very sparse? Those could be reasons. However, this kind of thing usually
> happens with very small data sets. I could be wrong about what's going
> on, but it's a decent guess at the immediate cause given the error
> messages.
>
> On Mon, Jul 13, 2015 at 12:12 PM, Benedict Liang
> <bl...@thecarousell.com> wrote:
> > Hi Sean,
> >
> > Thank you for your quick response. By very little data, do you mean that
> > the matrix is too sparse? Or are there too few data points? There are
> > 3856988 ratings in my dataset currently.
> >
> > Regards,
> > Benedict
> >
>

Re: MovieALS Implicit Error

Posted by Sean Owen <so...@cloudera.com>.
Is the data set synthetic, does it have very few items, or is it indeed
very sparse? Those could be reasons. However, this kind of thing usually
happens with very small data sets. I could be wrong about what's going
on, but it's a decent guess at the immediate cause given the error
messages.

On Mon, Jul 13, 2015 at 12:12 PM, Benedict Liang
<bl...@thecarousell.com> wrote:
> Hi Sean,
>
> Thank you for your quick response. By very little data, do you mean that the
> matrix is too sparse? Or are there too few data points? There are 3856988
> ratings in my dataset currently.
>
> Regards,
> Benedict
>



Re: MovieALS Implicit Error

Posted by Benedict Liang <bl...@thecarousell.com>.
Hi Sean,

Thank you for your quick response. By very little data, do you mean that
the matrix is too sparse? Or are there too few data points? There are
3856988 ratings in my dataset currently.

Regards,
Benedict
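
For scale: 3856988 ratings across 144250 users and 378937 movies is a density
of roughly 0.007%, or about 10 ratings per movie and 27 per user on average,
so a long tail of rows with only one or two interactions is likely even though
the total looks large. A quick diagnostic sketch (the helper name is
illustrative) to check that distribution directly:

import org.apache.spark.rdd.RDD
import org.apache.spark.mllib.recommendation.Rating

// Summarize how many ratings each user and item has. A heavy mass at
// counts of 1 or 2 is the kind of sparsity that hurts ALS, independent
// of the total number of ratings.
def sparsityReport(ratings: RDD[Rating]): Unit = {
  val perUser = ratings.map(r => (r.user, 1L)).reduceByKey(_ + _)
    .values.map(_.toDouble)
  val perItem = ratings.map(r => (r.product, 1L)).reduceByKey(_ + _)
    .values.map(_.toDouble)
  println(s"ratings per user: ${perUser.stats()}") // count, mean, stdev, min, max
  println(s"ratings per item: ${perItem.stats()}")
  println(s"users with < 3 ratings: ${perUser.filter(_ < 3).count()}")
  println(s"items with < 3 ratings: ${perItem.filter(_ < 3).count()}")
}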



On Mon, Jul 13, 2015 at 7:07 PM, Sean Owen <so...@cloudera.com> wrote:

> I interpret this to mean that the input to the Cholesky decomposition
> wasn't positive definite. I think this can happen if the input matrix
> is singular or very nearly singular -- maybe very little data? Ben, that
> might at least explain why this is happening; different input may work
> fine.
>
> Xiangrui, I think we might have discussed this a while ago, but I am not
> sure positive definiteness is a good assumption here, so I don't know that
> Cholesky can be used reliably. I have always used the QR decomposition
> for this reason. Then again, there is always a 10% chance I'm missing a
> subtlety there.
>
>
>
> On Mon, Jul 13, 2015 at 11:55 AM, bliang <bl...@thecarousell.com> wrote:
> > Hi, I am trying to run the MovieALS example with an implicit dataset and
> am
> > receiving this error:
> >
> > Got 3856988 ratings from 144250 users on 378937 movies.
> > Training: 3085522, test: 771466.
> > 15/07/13 10:43:07 WARN BLAS: Failed to load implementation from:
> > com.github.fommil.netlib.NativeSystemBLAS
> > 15/07/13 10:43:07 WARN BLAS: Failed to load implementation from:
> > com.github.fommil.netlib.NativeRefBLAS
> > 15/07/13 10:43:10 WARN TaskSetManager: Lost task 3.0 in stage 29.0 (TID
> 192,
> > 10.162.45.33): java.lang.AssertionError: assertion failed: lapack.dppsv
> > returned 1.
> >       at scala.Predef$.assert(Predef.scala:179)
> >       at
> >
> org.apache.spark.ml.recommendation.ALS$CholeskySolver.solve(ALS.scala:386)
> >       at
> >
> org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1163)
> >       at
> >
> org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1124)
> >       at
> >
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
> >       at
> >
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
> >       at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> >       at
> org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:277)
> >       at
> org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:171)
> >       at
> org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:78)
> >       at org.apache.spark.rdd.RDD.iterator(RDD.scala:242)
> >       at
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
> >       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
> >       at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
> >       at
> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
> >       at org.apache.spark.scheduler.Task.run(Task.scala:70)
> >       at
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
> >       at
> >
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >       at
> >
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >       at java.lang.Thread.run(Thread.java:745)
> >
> > 15/07/13 10:43:10 ERROR TaskSetManager: Task 12 in stage 29.0 failed 4
> > times; aborting job
> > Exception in thread "main" org.apache.spark.SparkException: Job aborted
> due
> > to stage failure: Task 12 in stage 29.0 failed 4 times, most recent
> failure:
> > Lost task 12.3 in stage 29.0 (TID 249, 10.162.45.33):
> > java.lang.AssertionError: assertion failed: lapack.dppsv returned 1.
> >       at scala.Predef$.assert(Predef.scala:179)
> >       at
> >
> org.apache.spark.ml.recommendation.ALS$CholeskySolver.solve(ALS.scala:386)
> >       at
> >
> org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1163)
> >       at
> >
> org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1124)
> >       at
> >
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
> >       at
> >
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
> >       at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> >       at
> org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:277)
> >       at
> org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:171)
> >       at
> org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:78)
> >       at org.apache.spark.rdd.RDD.iterator(RDD.scala:242)
> >       at
> org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
> >       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
> >       at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
> >       at
> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
> >       at org.apache.spark.scheduler.Task.run(Task.scala:70)
> >       at
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
> >       at
> >
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >       at
> >
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >       at java.lang.Thread.run(Thread.java:745)
> >
> > Driver stacktrace:
> >       at
> > org.apache.spark.scheduler.DAGScheduler.org
> $apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1266)
> >       at
> >
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1257)
> >       at
> >
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1256)
> >       at
> >
> scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
> >       at
> scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
> >       at
> >
> org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1256)
> >       at
> >
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
> >       at
> >
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
> >       at scala.Option.foreach(Option.scala:236)
> >       at
> >
> org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
> >       at
> >
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1450)
> >       at
> >
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1411)
> >       at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
> >
> > Would it be possible to help me out? Thank you, Ben
> > ________________________________
> > View this message in context: MovieALS Implicit Error
> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
>

Re: MovieALS Implicit Error

Posted by Sean Owen <so...@cloudera.com>.
I interpret this to mean that the input to the Cholesky decomposition
wasn't positive definite. I think this can happen if the input matrix
is singular or very nearly singular -- maybe very little data? Ben, that
might at least explain why this is happening; different input may work
fine.

Xiangrui, I think we might have discussed this a while ago, but I am not
sure positive definiteness is a good assumption here, so I don't know that
Cholesky can be used reliably. I have always used the QR decomposition
for this reason. Then again, there is always a 10% chance I'm missing a
subtlety there.
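
To make the positive-definiteness point concrete: each least-squares
subproblem ALS solves has the form (A^T A + lambda * I) x = A^T b, and dppsv
returning a positive code means the Cholesky factorization of that matrix
failed. Here is a self-contained toy sketch in plain Scala -- a hand-rolled
2x2 Cholesky, purely illustrative, not Spark's solver -- showing a singular
Gram matrix failing exactly this way and a nonzero lambda repairing it:

object CholeskyToy {
  // Naive Cholesky of a symmetric 2x2 matrix ((a, b), (b, c)).
  // Returns None when the matrix is not positive definite -- the same
  // condition LAPACK's dppsv reports with a positive return code.
  def cholesky2x2(a: Double, b: Double, c: Double): Option[(Double, Double, Double)] =
    if (a <= 0) None
    else {
      val l11 = math.sqrt(a)
      val l21 = b / l11
      val d = c - l21 * l21
      if (d <= 0) None else Some((l11, l21, math.sqrt(d)))
    }

  def main(args: Array[String]): Unit = {
    // Gram matrix A^T A of the single observation A = (1, 2): rank 1, singular.
    println(cholesky2x2(1.0, 2.0, 4.0)) // None: not positive definite
    // Adding lambda * I with lambda = 0.1 restores positive definiteness.
    println(cholesky2x2(1.1, 2.0, 4.1)) // Some((1.048..., 1.906..., 0.680...))
  }
}

This is also why a strictly positive lambda usually makes the assertion
disappear even on sparse data, and why QR, which does not require positive
definiteness, is the more conservative factorization.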



On Mon, Jul 13, 2015 at 11:55 AM, bliang <bl...@thecarousell.com> wrote:
> Hi, I am trying to run the MovieALS example with an implicit dataset and am
> receiving this error:
>
> Got 3856988 ratings from 144250 users on 378937 movies.
> Training: 3085522, test: 771466.
> 15/07/13 10:43:07 WARN BLAS: Failed to load implementation from:
> com.github.fommil.netlib.NativeSystemBLAS
> 15/07/13 10:43:07 WARN BLAS: Failed to load implementation from:
> com.github.fommil.netlib.NativeRefBLAS
> 15/07/13 10:43:10 WARN TaskSetManager: Lost task 3.0 in stage 29.0 (TID 192,
> 10.162.45.33): java.lang.AssertionError: assertion failed: lapack.dppsv
> returned 1.
> 	at scala.Predef$.assert(Predef.scala:179)
> 	at
> org.apache.spark.ml.recommendation.ALS$CholeskySolver.solve(ALS.scala:386)
> 	at
> org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1163)
> 	at
> org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1124)
> 	at
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
> 	at
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
> 	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> 	at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:277)
> 	at org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:171)
> 	at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:78)
> 	at org.apache.spark.rdd.RDD.iterator(RDD.scala:242)
> 	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
> 	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
> 	at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
> 	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:70)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
> 	at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)
>
> 15/07/13 10:43:10 ERROR TaskSetManager: Task 12 in stage 29.0 failed 4
> times; aborting job
> Exception in thread "main" org.apache.spark.SparkException: Job aborted due
> to stage failure: Task 12 in stage 29.0 failed 4 times, most recent failure:
> Lost task 12.3 in stage 29.0 (TID 249, 10.162.45.33):
> java.lang.AssertionError: assertion failed: lapack.dppsv returned 1.
> 	at scala.Predef$.assert(Predef.scala:179)
> 	at
> org.apache.spark.ml.recommendation.ALS$CholeskySolver.solve(ALS.scala:386)
> 	at
> org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1163)
> 	at
> org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1124)
> 	at
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
> 	at
> org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
> 	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
> 	at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:277)
> 	at org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:171)
> 	at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:78)
> 	at org.apache.spark.rdd.RDD.iterator(RDD.scala:242)
> 	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
> 	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
> 	at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
> 	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:70)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
> 	at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)
>
> Driver stacktrace:
> 	at
> org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1266)
> 	at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1257)
> 	at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1256)
> 	at
> scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
> 	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
> 	at
> org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1256)
> 	at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
> 	at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
> 	at scala.Option.foreach(Option.scala:236)
> 	at
> org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
> 	at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1450)
> 	at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1411)
> 	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
>
> Would it be possible to help me out? Thank you, Ben
> ________________________________
> View this message in context: MovieALS Implicit Error
> Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org