Posted to issues@spark.apache.org by "Balagopal Nair (JIRA)" <ji...@apache.org> on 2015/09/17 03:51:45 UTC

[jira] [Comment Edited] (SPARK-10644) Applications wait even if free executors are available

    [ https://issues.apache.org/jira/browse/SPARK-10644?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14791434#comment-14791434 ] 

Balagopal Nair edited comment on SPARK-10644 at 9/17/15 1:51 AM:
-----------------------------------------------------------------

No. These are independent jobs running under different SparkContexts.
Sorry about not being clear enough before... I'm trying to share the same cluster between various applications. This issue concerns scheduling across applications, not within a single application.
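
For concreteness, here is a minimal sketch of how each such independent application is configured. The master URL and app name are illustrative assumptions, not from this report; spark.cores.max is the "max cores" setting referred to in the steps below:

    import org.apache.spark.{SparkConf, SparkContext}

    // Each application creates its own SparkContext against the same
    // standalone master and caps itself at 10 cores.
    val conf = new SparkConf()
      .setAppName("independent-job-1")      // illustrative name
      .setMaster("spark://master:7077")     // hypothetical master URL
      .set("spark.cores.max", "10")         // "max cores set to 10"
    val sc = new SparkContext(conf)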


was (Author: nbalagopal):
No. These are independent jobs running under different SparkContexts

> Applications wait even if free executors are available
> ------------------------------------------------------
>
>                 Key: SPARK-10644
>                 URL: https://issues.apache.org/jira/browse/SPARK-10644
>             Project: Spark
>          Issue Type: Bug
>          Components: Scheduler
>    Affects Versions: 1.5.0
>         Environment: RHEL 6.5 64 bit
>            Reporter: Balagopal Nair
>
> Number of workers: 21
> Number of executors: 63
> Steps to reproduce:
> 1. Run 4 jobs each with max cores set to 10
> 2. The first 3 jobs get 10 executors each. (30 executors consumed so far)
> 3. The 4th job waits even though there are 33 idle executors.
> The reason is that a job will not get executors unless the total
> number of EXECUTORS in use < the number of WORKERS.
> If there are executors available, resources should be allocated to the pending job.
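> The arithmetic behind the observed wait can be sketched as follows.
> This is a simplified illustration of the condition described above,
> not the actual Spark scheduler code; all names are made up:
>
>     object SchedulerGateSketch {
>       val numWorkers = 21
>       val executorsPerWorker = 3      // 63 executors total
>       val executorsInUse = 3 * 10     // after the first 3 jobs
>
>       // Hypothetical gate as described above: a waiting job is only
>       // scheduled while EXECUTORS in use < WORKERS.
>       def fourthJobCanSchedule: Boolean = executorsInUse < numWorkers
>
>       def main(args: Array[String]): Unit = {
>         val idle = numWorkers * executorsPerWorker - executorsInUse
>         println(s"idle executors = $idle")                    // 33
>         println(s"4th job scheduled? $fourthJobCanSchedule")  // false: 30 < 21
>       }
>     }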



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org