Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/07/12 08:32:20 UTC

[jira] [Assigned] (SPARK-16435) Behavior changes if initialExecutor is less than minExecutor for dynamic allocation

     [ https://issues.apache.org/jira/browse/SPARK-16435?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-16435:
------------------------------------

    Assignee: Apache Spark

> Behavior changes if initialExecutor is less than minExecutor for dynamic allocation
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-16435
>                 URL: https://issues.apache.org/jira/browse/SPARK-16435
>             Project: Spark
>          Issue Type: Bug
>          Components: Scheduler, Spark Core
>    Affects Versions: 2.0.0
>            Reporter: Saisai Shao
>            Assignee: Apache Spark
>            Priority: Minor
>
> After SPARK-13723, the behavior changed for the case where {{spark.dynamicAllocation.initialExecutors}} is less than {{spark.dynamicAllocation.minExecutors}}.
> initialExecutors < minExecutors is an invalid setting.
> h4. Before SPARK-13723
> If initialExecutors < minExecutors, Spark throws an exception:
> {code}
> java.lang.IllegalArgumentException: requirement failed: initial executor number xxx must between min executor number xxx and max executor number xxx
> {code}
> This clearly lets the user know that the current configuration is invalid.
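> As a minimal, self-contained Scala sketch of that check (the object and value names here are illustrative, not the actual Spark code):
> {code}
> // Sketch of the pre-SPARK-13723 validation; the three values stand in
> // for the corresponding spark.dynamicAllocation.* settings.
> object OldValidationSketch extends App {
>   val minExecutors = 2
>   val maxExecutors = 10
>   val initialExecutors = 1
>
>   // require throws java.lang.IllegalArgumentException("requirement failed: ...")
>   // with the message quoted above.
>   require(initialExecutors >= minExecutors && initialExecutors <= maxExecutors,
>     s"initial executor number $initialExecutors must between min executor number " +
>     s"$minExecutors and max executor number $maxExecutors")
> }
> {code}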
> h4. After SPARK-13723
> Because we now also consider {{spark.executor.instances}}, the initial executor number is the maximum of minExecutors, initialExecutors, and numExecutors.
> This silently ignores the situation where initialExecutors < minExecutors.
> So at the very least we should add a warning log to let the user know this is an invalid configuration; see the sketch below.
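> A minimal sketch of the new max-of-three behavior plus the proposed warning (the helper name and hard-coded values are illustrative; the real code reads these settings from SparkConf):
> {code}
> // Sketch of the post-SPARK-13723 initial-count computation, with the
> // proposed warning for the invalid initialExecutors < minExecutors case.
> object InitialExecutorsSketch {
>   def initialCount(minExecutors: Int, initialExecutors: Int, numExecutors: Int): Int = {
>     if (initialExecutors < minExecutors) {
>       // Proposed fix: warn instead of silently ignoring the invalid setting.
>       Console.err.println(
>         s"WARN: spark.dynamicAllocation.initialExecutors ($initialExecutors) is less " +
>         s"than spark.dynamicAllocation.minExecutors ($minExecutors); the larger value wins.")
>     }
>     // SPARK-13723 behavior: take the max of the three settings.
>     Seq(minExecutors, initialExecutors, numExecutors).max
>   }
>
>   def main(args: Array[String]): Unit = {
>     // initialExecutors (1) < minExecutors (2): warns, then returns 2.
>     println(initialCount(minExecutors = 2, initialExecutors = 1, numExecutors = 0))
>   }
> }
> {code}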
> What do you think [~tgraves], [~rdblue]?


