Posted to reviews@spark.apache.org by liu-zhaokun <gi...@git.apache.org> on 2017/12/07 09:16:20 UTC

[GitHub] spark pull request #19919: [SPARK-22727] spark.executor.instances's default ...

GitHub user liu-zhaokun opened a pull request:

    https://github.com/apache/spark/pull/19919

    [SPARK-22727] spark.executor.instances's default value should be 2

    [https://issues.apache.org/jira/browse/SPARK-22727](https://issues.apache.org/jira/browse/SPARK-22727)
    ## What changes were proposed in this pull request?
    
    When I run an application on YARN, I don't set the value of spark.executor.instances, so I expected its default to be 2, as running-on-yarn.md says. But the driver logs "spark.executor.instances less than spark.dynamicAllocation.minExecutors is invalid, ignoring its setting, please update your configs.", so the effective default of this configuration isn't 2. I think we should fix it.
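    
    For context, a minimal sketch of the kind of check that could produce this warning (names and shape are assumed for illustration, not the actual Spark source). Note how an unset spark.executor.instances falls back to 0 rather than 2, which is why the warning can appear even though the setting was never made:
    
        // Simplified sketch of the dynamic-allocation validation (assumed shape).
        def warnIfInstancesBelowMin(conf: Map[String, String]): Unit = {
          val minExecutors = conf.get("spark.dynamicAllocation.minExecutors").fold(0)(_.toInt)
          val instances = conf.get("spark.executor.instances").fold(0)(_.toInt)
          if (instances < minExecutors) {
            println("spark.executor.instances less than " +
              "spark.dynamicAllocation.minExecutors is invalid, " +
              "ignoring its setting, please update your configs.")
          }
        }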
    
    ## How was this patch tested?
    
    (Please explain how this patch was tested. E.g. unit tests, integration tests, manual tests)
    (If this patch involves UI changes, please attach a screenshot; otherwise, remove this)
    
    Please review http://spark.apache.org/contributing.html before opening a pull request.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/liu-zhaokun/spark master1207

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19919.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19919
    
----
commit 9c3cd44c0cf0501023a8adab262a14f257acf012
Author: liuzhaokun <li...@zte.com.cn>
Date:   2017-12-07T09:10:29Z

    [SPARK-22727] spark.executor.instances's default value should be 2

----


---



[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/19919
  
    `ApplicationMasterArguments.DEFAULT_NUMBER_EXECUTORS`.
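  
    For reference, that constant is defined in the YARN module roughly as follows (a sketch, not verbatim source):
  
        object ApplicationMasterArguments {
          val DEFAULT_NUMBER_EXECUTORS = 2
        }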


---



[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/19919
  
    If the default is correctly described as 2, then I think there is nothing to do here and this should be closed. This change will cause other problems, at least.


---



[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

Posted by liu-zhaokun <gi...@git.apache.org>.
Github user liu-zhaokun commented on the issue:

    https://github.com/apache/spark/pull/19919
  
    @srowen 
    Where is the default value "2" of spark.executor.instances used?


---



[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

Posted by liu-zhaokun <gi...@git.apache.org>.
Github user liu-zhaokun commented on the issue:

    https://github.com/apache/spark/pull/19919
  
    @srowen 
    If the default is not correct, I can also fix it in this PR.


---



[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/19919
  
    No, this change breaks the existing logic. These constants do not actually control the default number of instances; instead they are used to detect whether the value was set at all. With this change, for example, you'd now get an error if you set min executors to 0 but don't set the number of executor instances (which you wouldn't normally).
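  
    To illustrate the distinction (a sketch with assumed shapes, not the actual Spark source): an optional entry lets the code tell "unset" apart from any explicit value, and giving it a hard default collapses the two cases:
  
        // With an optional entry, "unset" is observable:
        val instances: Option[Int] = None                  // user left it unset
        val dynamicAllocationOk = instances.isEmpty        // detection-style check
  
        // Forcing a default of 2 makes every run look explicitly configured:
        val forced: Option[Int] = instances.orElse(Some(2))
        val stillOk = forced.isEmpty                       // now always false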


---



[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/19919
  
    Can one of the admins verify this patch?


---



[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/19919
  
    Also `YarnSparkHadoopUtil.DEFAULT_NUMBER_EXECUTORS` which actually seems to be used.
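  
    A simplified sketch of that resolution path (method shape assumed, not verbatim source):
  
        val DEFAULT_NUMBER_EXECUTORS = 2
  
        // Fall back to the YARN-side default only when the user set nothing:
        def initialExecutorCount(conf: Map[String, String]): Int =
          conf.get("spark.executor.instances").fold(DEFAULT_NUMBER_EXECUTORS)(_.toInt)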


---



[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/19919
  
    Hm, I thought the default would appear in `.../config/package.scala`, but I don't see it. I'm not actually sure it defaults to 2 anywhere (?). (@vanzin, does this ring a bell?) I don't think these changes should be made, but I'm also wondering if the docs are correct that the default is 2.
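  
    If the entry in config/package.scala is declared without a default, e.g. via createOptional (a sketch assuming the ConfigBuilder API of the time), that would explain why no "2" shows up there:
  
        private[spark] val EXECUTOR_INSTANCES = ConfigBuilder("spark.executor.instances")
          .intConf
          .createOptional   // no default here; 2 lives only in the YARN code path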


---



[GitHub] spark issue #19919: [SPARK-22727] spark.executor.instances's default value s...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/19919
  
    Can one of the admins verify this patch?


---



[GitHub] spark pull request #19919: [SPARK-22727] spark.executor.instances's default ...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/19919


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org