Posted to dev@spark.apache.org by Bharath Ravi Kumar <re...@gmail.com> on 2014/06/27 06:17:54 UTC

NPE calling reduceByKey on JavaPairRDD

Hi,

I've been encountering an NPE when invoking reduceByKey on a JavaPairRDD since
upgrading to 1.0.0. The issue is straightforward to reproduce with 1.0.0 and
doesn't occur with 0.9.0. The stack trace is as follows (a minimal sketch of
the call pattern is included after the trace):

14/06/26 21:05:35 WARN scheduler.TaskSetManager: Loss was due to java.lang.NullPointerException
java.lang.NullPointerException
    at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:750)
    at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:750)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at org.apache.spark.Aggregator.combineValuesByKey(Aggregator.scala:59)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$1.apply(PairRDDFunctions.scala:96)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$1.apply(PairRDDFunctions.scala:95)
    at org.apache.spark.rdd.RDD$$anonfun$14.apply(RDD.scala:582)
    at org.apache.spark.rdd.RDD$$anonfun$14.apply(RDD.scala:582)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:158)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
    at org.apache.spark.scheduler.Task.run(Task.scala:51)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:187)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
    at java.lang.Thread.run(Thread.java:722)

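For context, here is a minimal sketch of the kind of Java call pattern that leads into these frames: a PairFunction passed to mapToPair (which Spark wraps internally via the pairFunToScalaFun shown in the trace) followed by reduceByKey. The class name, sample data, and parsing logic below are illustrative only and are not the actual job that hit the NPE:

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;

import scala.Tuple2;

public class ReduceByKeySketch {
  public static void main(String[] args) {
    SparkConf conf = new SparkConf().setAppName("reduceByKey-sketch").setMaster("local[2]");
    JavaSparkContext sc = new JavaSparkContext(conf);

    JavaRDD<String> lines = sc.parallelize(Arrays.asList("a 1", "b 2", "a 3"));

    // The PairFunction below is what pairFunToScalaFun wraps on the Scala side.
    JavaPairRDD<String, Integer> pairs = lines.mapToPair(
        new PairFunction<String, String, Integer>() {
          @Override
          public Tuple2<String, Integer> call(String line) {
            String[] parts = line.split(" ");
            return new Tuple2<String, Integer>(parts[0], Integer.parseInt(parts[1]));
          }
        });

    // reduceByKey drives the map-side combine (Aggregator.combineValuesByKey) seen in the trace.
    JavaPairRDD<String, Integer> sums = pairs.reduceByKey(
        new Function2<Integer, Integer, Integer>() {
          @Override
          public Integer call(Integer a, Integer b) {
            return a + b;
          }
        });

    System.out.println(sums.collect());
    sc.stop();
  }
}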

I've raised a bug to track this issue:
https://issues.apache.org/jira/browse/SPARK-2292

Thanks,
Bharath

Re: NPE calling reduceByKey on JavaPairRDD

Posted by Reynold Xin <rx...@databricks.com>.
Responded on the jira...