Posted to reviews@spark.apache.org by JoshRosen <gi...@git.apache.org> on 2015/02/03 03:21:43 UTC

[GitHub] spark pull request: [SPARK-4939] revive offers periodically in Loc...

Github user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4147#discussion_r23978147
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/local/LocalBackend.scala ---
    @@ -74,10 +77,20 @@ private[spark] class LocalActor(
     
       def reviveOffers() {
         val offers = Seq(new WorkerOffer(localExecutorId, localExecutorHostname, freeCores))
    -    for (task <- scheduler.resourceOffers(offers).flatten) {
    -      freeCores -= scheduler.CPUS_PER_TASK
    -      executor.launchTask(executorBackend, taskId = task.taskId, attemptNumber = task.attemptNumber,
    -        task.name, task.serializedTask)
    +    val tasks = scheduler.resourceOffers(offers).flatten
    +    if (tasks.nonEmpty) {
    +      for (task <- tasks) {
    +        freeCores -= scheduler.CPUS_PER_TASK
    +        executor.launchTask(executorBackend, taskId = task.taskId,
    +          attemptNumber = task.attemptNumber, task.name, task.serializedTask)
    +      }
    +    } else if (scheduler.activeTaskSets.nonEmpty) {
    +      // Try to reviveOffer after 1 second, because scheduler may wait for locality timeout
    +      timer.schedule(new TimerTask {
    +        override def run(): Unit = {
    +          reviveOffers()
    --- End diff --
    
    For thread-safety reasons, I think we should have this send a `ReviveOffers` message to the actor rather than calling `reviveOffers` directly. Akka has a nice way to handle this pattern: it can schedule a message to be sent after a delay using the ActorSystem's scheduler, so we wouldn't have to create our own timer. Here's an example usage of `scheduleOnce` for this, from `Master.scala`: https://github.com/apache/spark/blob/v1.2.1-rc3/core/src/main/scala/org/apache/spark/deploy/master/Master.scala#L186
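
    To sketch what that could look like here (a minimal, hypothetical example with made-up names, untested and not the actual patch; the real `LocalActor` has its own state and message handling), the idea is that the delayed retry goes through the actor's mailbox via `scheduleOnce` instead of a separate `Timer` thread:

        import akka.actor.Actor
        import scala.concurrent.duration._

        case object ReviveOffers

        class LocalActorSketch extends Actor {
          import context.dispatcher  // ExecutionContext used by the scheduler

          def receive = {
            case ReviveOffers => reviveOffers()
          }

          def reviveOffers(): Unit = {
            val launchedTasks = false  // placeholder for the real resourceOffers/launchTask logic
            if (!launchedTasks) {
              // Ask the ActorSystem's scheduler to deliver ReviveOffers back to this actor
              // after 1 second. The retry then runs through the actor's mailbox, so mutable
              // state such as freeCores is only ever touched from the actor's own thread.
              context.system.scheduler.scheduleOnce(1.second, self, ReviveOffers)
            }
          }
        }

    If the actor has stopped by the time the delay elapses, the scheduled message simply goes to dead letters, so there is no timer object to clean up.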

