Posted to issues@spark.apache.org by "Jongyoul Lee (JIRA)" <ji...@apache.org> on 2014/11/21 00:52:35 UTC

[jira] [Created] (SPARK-4525) MesosSchedulerBackend.resourceOffers cannot decline unused offers from acceptedOffers

Jongyoul Lee created SPARK-4525:
-----------------------------------

             Summary: MesosSchedulerBackend.resourceOffers cannot decline unused offers from acceptedOffers
                 Key: SPARK-4525
                 URL: https://issues.apache.org/jira/browse/SPARK-4525
             Project: Spark
          Issue Type: Bug
          Components: Mesos
    Affects Versions: 1.2.0, 1.3.0
            Reporter: Jongyoul Lee
            Priority: Blocker
             Fix For: 1.2.0, 1.3.0


After the resourceOffers function was refactored in SPARK-2269, it no longer declines the unused offers among the accepted offers. Previously the decline happened implicitly: when driver.launchTasks is invoked with an empty task collection, it has the same effect as declineOffer(offer.id).
{quote}
Invoking this function with an empty collection of tasks declines offers in their entirety (see SchedulerDriver.declineOffer(OfferID, Filters)).
- http://mesos.apache.org/api/latest/java/org/apache/mesos/MesosSchedulerDriver.html#launchTasks(OfferID,%20java.util.Collection,%20Filters)
{quote}
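A minimal sketch of how code can rely on this behavior (the helper name and signature below are illustrative, not the actual Spark source):
{code:scala}
import java.util.Collections
import org.apache.mesos.SchedulerDriver
import org.apache.mesos.Protos.{Filters, Offer, TaskInfo}

// Per the javadoc quoted above, launching with an empty task collection
// declines the offer in its entirety, so calling launchTasks once per offer
// covers both the "launch" and the "decline" cases.
def launchOrDecline(
    d: SchedulerDriver,
    offer: Offer,
    tasks: java.util.List[TaskInfo],
    filters: Filters): Unit = {
  // If `tasks` is empty this is equivalent to d.declineOffer(offer.getId, filters).
  d.launchTasks(Collections.singleton(offer.getId), tasks, filters)
}
{code}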

In branch-1.1, resourceOffers called launchTasks for every offer it received, so the driver implicitly declined any unused resources. In the current master, the offers are first partitioned, based on their resources, into accepted and declined offers. The declined offers are declined explicitly, and the accepted offers that have tasks assigned to them are launched via driver.launchTasks, but the accepted offers that received no tasks are neither launched with an empty task list nor declined explicitly. As a result, the Mesos master assumes those offers are still held by the TaskScheduler and that no resources remain.
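A hedged sketch of the missing step (the method name and the task-to-offer map are illustrative, not the exact Spark code): accepted offers that end up with no tasks should be declined explicitly, otherwise their resources appear consumed to the Mesos master.
{code:scala}
import java.util.Collections
import org.apache.mesos.SchedulerDriver
import org.apache.mesos.Protos.{Offer, OfferID, TaskInfo}

def handleAcceptedOffers(
    d: SchedulerDriver,
    acceptedOffers: Seq[Offer],
    tasksByOffer: Map[OfferID, java.util.Collection[TaskInfo]]): Unit = {
  for (offer <- acceptedOffers) {
    tasksByOffer.get(offer.getId) match {
      case Some(tasks) if !tasks.isEmpty =>
        d.launchTasks(Collections.singleton(offer.getId), tasks)
      case _ =>
        // This is the call the refactored code omits: without it the offer is
        // neither launched nor declined, so its resources look used up.
        d.declineOffer(offer.getId)
    }
  }
}
{code}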



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
