Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2014/09/17 22:10:39 UTC

[jira] [Commented] (SPARK-3571) Spark standalone cluster mode doesn't work.

    [ https://issues.apache.org/jira/browse/SPARK-3571?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14137884#comment-14137884 ] 

Apache Spark commented on SPARK-3571:
-------------------------------------

User 'sarutak' has created a pull request for this issue:
https://github.com/apache/spark/pull/2436

> Spark standalone cluster mode doesn't work.
> -------------------------------------------
>
>                 Key: SPARK-3571
>                 URL: https://issues.apache.org/jira/browse/SPARK-3571
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Kousuke Saruta
>            Priority: Blocker
>
> A recent change to Master.scala breaks Spark standalone cluster mode.
> I think the loop in Master#schedule can never assign a worker to a waiting driver: startPos is set equal to curPos immediately before the while loop, so the condition curPos != startPos is false on the very first check and the loop body never executes.
> {code}
>     for (driver <- waitingDrivers.toList) { // iterate over a copy of waitingDrivers
>       // We assign workers to each waiting driver in a round-robin fashion. For each driver, we
>       // start from the last worker that was assigned a driver, and continue onwards until we have
>       // explored all alive workers.
>       curPos = (curPos + 1) % aliveWorkerNum
>       val startPos = curPos
>       var launched = false
>       while (curPos != startPos && !launched) {
>         val worker = shuffledAliveWorkers(curPos)
>         if (worker.memoryFree >= driver.desc.mem && worker.coresFree >= driver.desc.cores) {
>           launchDriver(worker, driver)
>           waitingDrivers -= driver
>           launched = true
>         }
>         curPos = (curPos + 1) % aliveWorkerNum
>       }
> {code}
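> One possible restructuring, sketched below, is to count how many workers have been visited instead of comparing positions, so the loop body actually runs and each alive worker is tried at most once per driver. This is only an illustration of the idea, reusing the variables from the snippet above; it is not necessarily the change made in the pull request, and the counter name numWorkersVisited is introduced here for illustration only.
> {code}
>     for (driver <- waitingDrivers.toList) { // iterate over a copy of waitingDrivers
>       var launched = false
>       var numWorkersVisited = 0
>       // Unlike (curPos != startPos), this condition is true on the first check,
>       // so the body runs and visits each alive worker at most once.
>       while (numWorkersVisited < aliveWorkerNum && !launched) {
>         val worker = shuffledAliveWorkers(curPos)
>         numWorkersVisited += 1
>         if (worker.memoryFree >= driver.desc.mem && worker.coresFree >= driver.desc.cores) {
>           launchDriver(worker, driver)
>           waitingDrivers -= driver
>           launched = true
>         }
>         curPos = (curPos + 1) % aliveWorkerNum
>       }
>     }
> {code}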


