Posted to user@hadoop.apache.org by Fei Hu <hu...@gmail.com> on 2016/12/31 01:55:10 UTC

context.runJob() was suspended in getPreferredLocations() function

Dear all,

I tried to customize my own RDD. In its getPreferredLocations() method, I
used the following code to query another RDD, which was used as an input to
initialize this customized RDD:

    val results: Array[Array[DataChunkPartition]] =
      context.runJob(partitionsRDD,
        (context: TaskContext, partIter: Iterator[DataChunkPartition]) => partIter.toArray,
        partitions, allowLocal = true)

The problem is that when this code executes, the job appears to hang: it
simply stops at this call, with no errors and no output.

What is the reason for it?

Thanks,
Fei

Re: context.runJob() was suspended in getPreferredLocations() function

Posted by Liang-Chi Hsieh <vi...@gmail.com>.
Hi,

Simply put, you are submitting another job from inside the scheduler's event
thread, which is blocked by your call and therefore can never process the new
job-submission event. Your second job submission is never handled, so
getPreferredLocations() never returns.
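
The mechanism can be illustrated without Spark at all. Below is a minimal
sketch that uses a plain single-threaded executor as a stand-in for the
DAGScheduler's single event-processing thread; EventLoopDeadlock and its
contents are hypothetical names for illustration, not Spark API. A task
running on the event thread submits a second task to the same thread and
blocks waiting for its result, so the second task can never start:

```scala
import java.util.concurrent.{Executors, TimeUnit}

// Hypothetical demo: a single-threaded executor plays the role of the
// scheduler's event loop. Returns true if the nested submission deadlocked.
object EventLoopDeadlock {
  def demo(): Boolean = {
    val eventLoop = Executors.newSingleThreadExecutor()

    // First "job": runs on the event thread...
    val outer = eventLoop.submit(new Runnable {
      def run(): Unit = {
        // ...and from inside it, submits a second "job" to the same
        // single-threaded loop and blocks on its result -- the analogue
        // of calling context.runJob() inside getPreferredLocations().
        val inner = eventLoop.submit(new Runnable {
          def run(): Unit = () // never gets a chance to run
        })
        // The only thread that could run `inner` is the one blocked here,
        // so this wait can only end by timing out.
        inner.get(500, TimeUnit.MILLISECONDS)
      }
    })

    val deadlocked =
      try { outer.get(5, TimeUnit.SECONDS); false }
      catch { case _: Exception => true } // inner.get timed out inside outer
    eventLoop.shutdownNow()
    deadlocked
  }
}
```

The usual way out is to avoid submitting jobs from inside
getPreferredLocations() altogether: compute whatever you need from the input
RDD eagerly on the driver while constructing the custom RDD, and have
getPreferredLocations() return the cached result.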




-----
Liang-Chi Hsieh | @viirya 
Spark Technology Center 
http://www.spark.tc/ 
--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/context-runJob-was-suspended-in-getPreferredLocations-function-tp20412p20419.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org