Posted to reviews@spark.apache.org by dhruve <gi...@git.apache.org> on 2018/09/11 19:36:24 UTC

[GitHub] spark pull request #22288: [SPARK-22148][SPARK-15815][Scheduler] Acquire new...

Github user dhruve commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22288#discussion_r216795021
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala ---
    @@ -414,9 +425,48 @@ private[spark] class TaskSchedulerImpl(
                 launchedAnyTask |= launchedTaskAtCurrentMaxLocality
               } while (launchedTaskAtCurrentMaxLocality)
             }
    +
             if (!launchedAnyTask) {
    -          taskSet.abortIfCompletelyBlacklisted(hostToExecutors)
    -        }
    +          taskSet.getCompletelyBlacklistedTaskIfAny(hostToExecutors) match {
    +            case taskIndex: Some[Int] => // Returns the taskIndex which was unschedulable
    +
    +              // If the taskSet is unschedulable we kill an existing blacklisted executor/s and
    +              // kick off an abortTimer which after waiting will abort the taskSet if we were
    +              // unable to schedule any task from the taskSet.
    +              // Note: We keep a track of schedulability on a per taskSet basis rather than on a
    +              // per task basis.
    +              val executor = hostToExecutors.valuesIterator.next().iterator.next()
    --- End diff --
    
    That's a nice suggestion. 
    
    There is a case where you could have only a few executors running, say just 3, all of them blacklisted but still with some tasks running on them. To handle this, I had started modifying this to take down the executor with the fewest tasks running on it. I'll look into this some more.
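    A minimal, self-contained sketch of that "fewest running tasks" selection (hypothetical names and data, not the actual change in this PR) might look like:
    
        // Sketch only: among the blacklisted executors, pick the one with the
        // fewest running tasks as the kill candidate. Executor IDs and task
        // counts here are made up for illustration.
        object LeastBusyBlacklistedExecutor {
          def pick(
              blacklistedExecs: Set[String],
              runningTasksPerExec: Map[String, Int]): Option[String] = {
            if (blacklistedExecs.isEmpty) {
              None
            } else {
              // Executors with no entry in the map are treated as idle (0 tasks).
              Some(blacklistedExecs.minBy(exec => runningTasksPerExec.getOrElse(exec, 0)))
            }
          }
    
          def main(args: Array[String]): Unit = {
            val blacklisted = Set("exec-1", "exec-2", "exec-3")
            val running = Map("exec-1" -> 4, "exec-2" -> 1, "exec-3" -> 3)
            // exec-2 has the fewest running tasks, so it would be the one taken down.
            println(pick(blacklisted, running)) // Some(exec-2)
          }
        }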


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org