Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/05/02 04:28:57 UTC

[GitHub] [spark] ajithme commented on a change in pull request #24438: [SPARK-23626][CORE] DAGScheduler blocked due to JobSubmitted event

URL: https://github.com/apache/spark/pull/24438#discussion_r280286559
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala
 ##########
 @@ -704,12 +704,31 @@ private[spark] class DAGScheduler(
     assert(partitions.nonEmpty)
     val func2 = func.asInstanceOf[(TaskContext, Iterator[_]) => _]
     val waiter = new JobWaiter[U](this, jobId, partitions.size, resultHandler)
+    eagerPartitions(rdd)
     eventProcessLoop.post(JobSubmitted(
       jobId, rdd, func2, partitions.toArray, callSite, waiter,
       SerializationUtils.clone(properties)))
     waiter
   }
 
+  /**
+   * Responsible for eager evaluation of all dependency partitions.
 +   * Takes effect only if <b>spark.rdd.eager.partitions</b> is true.
 +   * @param rdd initial RDD to be evaluated
+   */
+  def eagerPartitions(rdd: RDD[_]): Unit = {
+    if (sc.getConf.get(config.EAGER_RDD_PARTITION_EVALUATE)) {
+      try {
+        rdd.dependencies.foreach { dependency =>
 
 Review comment:
   I checked that all the caller methods were already calling ``rdd.partitions``, hence I skipped calling it again. But yes, I agree with your suggestion; I will change ``eagerPartitions`` to call ``rdd.partitions`` itself for completeness.
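
   For illustration, here is a minimal sketch of what ``eagerPartitions`` might look like after that change. This is a hedged reconstruction, not the PR's actual code: the recursion into ``dependency.rdd``, the ``NonFatal`` guard, and the log message are assumptions extrapolated from the truncated diff above.

   ```scala
   import scala.util.control.NonFatal

   /**
    * Eagerly evaluates the partitions of `rdd` and all of its ancestors so the
    * potentially expensive partition computation happens on the caller thread
    * instead of blocking the DAGScheduler event loop (SPARK-23626).
    * Takes effect only if spark.rdd.eager.partitions is true.
    */
   def eagerPartitions(rdd: RDD[_]): Unit = {
     if (sc.getConf.get(config.EAGER_RDD_PARTITION_EVALUATE)) {
       try {
         // Force partition evaluation for this RDD itself (the reviewer's
         // suggestion), then recurse into every parent RDD.
         rdd.partitions
         rdd.dependencies.foreach { dependency =>
           eagerPartitions(dependency.rdd)
         }
       } catch {
         case NonFatal(e) =>
           // Non-fatal: if eager evaluation fails, partitions are simply
           // computed lazily inside the event loop, as before.
           logWarning(s"Eager partition evaluation failed for $rdd", e)
       }
     }
   }
   ```

   Re-checking the config flag at the top of each recursive call is slightly redundant, but it keeps the sketch close to the structure shown in the diff.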

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org