Posted to user@spark.apache.org by Nick Pentreath <ni...@gmail.com> on 2014/01/16 13:37:37 UTC

Expect only DirectTaskResults when using LocalScheduler

This has me puzzled.

I'm using 0.8.1-incubating and trying to run a pretty simple mapValues on
an RDD that is the result of computing an MLlib ALS model (so it is an
RDD[(Int, Array[Double])]).
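
Roughly, the code looks like the following (a simplified sketch rather
than my exact code -- the ratings source, the ALS parameters, and the body
of the mapValues here are placeholders):

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._  // implicit conversions that add mapValues to pair RDDs
import org.apache.spark.mllib.recommendation.{ALS, Rating}

val sc = new SparkContext("local[2]", "als-mapvalues")

// Placeholder input: parse (user, product, rating) triples from a CSV file
val ratings = sc.textFile("ratings.csv").map { line =>
  val Array(user, product, rating) = line.split(',')
  Rating(user.toInt, product.toInt, rating.toDouble)
}

// Placeholder parameters: rank, iterations, lambda
val model = ALS.train(ratings, 10, 10, 0.01)

// userFeatures is an RDD[(Int, Array[Double])]; the failure occurs when an
// action runs a simple mapValues over it, e.g. computing an L2 norm per user
val norms = model.userFeatures.mapValues { features =>
  math.sqrt(features.map(x => x * x).sum)
}
norms.count()  // triggers the job that fails with the exception below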

I get the following failure, which I've never come across before:

org.apache.spark.SparkException (org.apache.spark.SparkException: Expect
only DirectTaskResults when using LocalScheduler)

org.apache.spark.scheduler.local.LocalTaskSetManager.taskEnded(LocalTaskSetManager.scala:147)
org.apache.spark.scheduler.local.LocalScheduler.statusUpdate(LocalScheduler.scala:200)
org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:252)
org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:50)
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
java.lang.Thread.run(Thread.java:724)