Posted to issues@spark.apache.org by "SaintBacchus (JIRA)" <ji...@apache.org> on 2015/06/05 03:35:38 UTC

[jira] [Updated] (SPARK-8119) Spark sets the total executor count when some executors fail.

     [ https://issues.apache.org/jira/browse/SPARK-8119?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

SaintBacchus updated SPARK-8119:
--------------------------------
    Description: 
DynamicAllocation lowers the total executor count when it wants to kill some executors.
But in the non-DynamicAllocation scenario, Spark also sets the total executor count.
This causes the following problem: when an executor fails, Spark never brings up a new executor to replace it.
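
Below is a minimal, self-contained sketch of the failure mode in plain Scala. It is not Spark source code; all names here (ExecutorTargetDemo, BackendState, removeExecutor, replacementsNeeded) are hypothetical, chosen only to illustrate how coupling "remove an executor" to "lower the requested total" leaves a failed executor unreplaced.

{code}
// Toy model of the reported behavior (hypothetical names; not Spark source).
object ExecutorTargetDemo {

  // executors: the currently registered executor ids.
  // targetTotal: the total the backend has asked the cluster manager for.
  final case class BackendState(executors: Set[String], targetTotal: Int)

  // Mirrors the problematic path: removing an executor also shrinks the
  // requested total, even when dynamic allocation is disabled.
  def removeExecutor(state: BackendState, id: String): BackendState =
    if (!state.executors.contains(id)) state
    else BackendState(state.executors - id, state.targetTotal - 1)

  // The cluster manager only launches executors up to targetTotal, so this
  // is how many replacements it would start after a failure.
  def replacementsNeeded(state: BackendState): Int =
    math.max(0, state.targetTotal - state.executors.size)

  def main(args: Array[String]): Unit = {
    val start = BackendState(Set("exec-1", "exec-2", "exec-3"), targetTotal = 3)
    val afterFailure = removeExecutor(start, "exec-2")
    // Prints: executors=2 target=2 replacements=0
    // i.e. the failed executor is never replaced.
    println(s"executors=${afterFailure.executors.size} " +
      s"target=${afterFailure.targetTotal} " +
      s"replacements=${replacementsNeeded(afterFailure)}")
  }
}
{code}

If the non-DynamicAllocation path left targetTotal untouched, replacementsNeeded would return 1 after the failure and the cluster manager would be asked to start a replacement.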

  was:
DynamicAllocation lowers the total executor count when it wants to kill some executors.
But in the non-DynamicAllocation scenario, Spark also sets the total executor count. This causes the following problem: when an executor fails, Spark never brings up a new executor to replace it.


> Spark sets the total executor count when some executors fail.
> -------------------------------------------------------------
>
>                 Key: SPARK-8119
>                 URL: https://issues.apache.org/jira/browse/SPARK-8119
>             Project: Spark
>          Issue Type: Bug
>          Components: Scheduler
>    Affects Versions: 1.4.0
>            Reporter: SaintBacchus
>             Fix For: 1.4.0
>
>
> DynamicAllocation lowers the total executor count when it wants to kill some executors.
> But in the non-DynamicAllocation scenario, Spark also sets the total executor count.
> This causes the following problem: when an executor fails, Spark never brings up a new executor to replace it.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org