Posted to reviews@spark.apache.org by heary-cao <gi...@git.apache.org> on 2017/07/06 13:13:13 UTC

[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

GitHub user heary-cao opened a pull request:

    https://github.com/apache/spark/pull/18555

    [Minor]add checkValue in spark.internal.config about how to correctly set configurations

    
    ## What changes were proposed in this pull request?
    
    Add a `checkValue` for the configuration `spark.driver.memory`.
    Add a `checkValue` for the configuration `spark.executor.memory`.
    Add a `checkValue` for the configuration `spark.blockManager.port`.
    
    ## How was this patch tested?
    Existing tests.
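The validation pattern this PR adds can be sketched outside Spark as a tiny, self-contained builder. This is not Spark's actual `ConfigBuilder` (which lives in `org.apache.spark.internal.config`); all names below are illustrative only, but the `spark.executor.memory` bound mirrors the check proposed in the diff:

```scala
// Minimal sketch of the checkValue pattern; NOT Spark's ConfigBuilder API.
final case class ConfigEntry[T](key: String, default: T, check: T => Unit) {
  // Resolve a raw user value (or fall back to the default) and validate it.
  def get(raw: Option[T]): T = {
    val v = raw.getOrElse(default)
    check(v)
    v
  }
}

object ConfigEntry {
  // Attach a predicate plus a human-readable message, like checkValue does.
  def withCheck[T](key: String, default: T)(valid: T => Boolean, msg: String): ConfigEntry[T] =
    ConfigEntry(key, default, v => require(valid(v), s"$key: $msg (got $v)"))
}

object ConfigDemo {
  // Mirrors the PR's intent for spark.executor.memory (in MiB): strictly positive,
  // and small enough that converting MiB to bytes cannot overflow a Long.
  val executorMemoryMiB: ConfigEntry[Long] =
    ConfigEntry.withCheck("spark.executor.memory", 1024L)(
      v => v > 0 && v <= Long.MaxValue / (1024L * 1024L),
      "executor memory must be a positive number of MiB")

  def main(args: Array[String]): Unit = {
    println(executorMemoryMiB.get(Some(2048L)))
    try executorMemoryMiB.get(Some(0L))
    catch { case e: IllegalArgumentException => println("rejected: " + e.getMessage) }
  }
}
```

The point of failing fast in the builder is that a bad value is reported against its config key at load time, rather than surfacing later as an obscure runtime error.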


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/heary-cao/spark driver_host

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/18555.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #18555
    
----
commit fc4f269328440c584ef5adde491463414cbf1fea
Author: caoxuewen <ca...@zte.com.cn>
Date:   2017-07-06T13:06:15Z

    add checkValue in spark.internal.config about how to correctly set configurations

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79602/
    Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79635 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79635/testReport)** for PR 18555 at commit [`5b2c9cc`](https://github.com/apache/spark/commit/5b2c9cc9464b00a0e9925b7bc06e16389bbcfc7c).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #87053 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87053/testReport)** for PR 18555 at commit [`a0efb41`](https://github.com/apache/spark/commit/a0efb41fe041520bf423933b3c7b18ad2feb6711).




[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126315664
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +62,61 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 512, 1g, 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Long.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0 MiB " +
    +      s"and less than ${Long.MaxValue / (1024 * 1024)} MiB.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(_ > 0,
    +      "Number of cores to allocate for task event queue must not be negative")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    --- End diff --
    
    check value with `_ >= 0`?
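The reviewer's point can be shown with a small standalone snippet (hypothetical `validate` helper, not Spark's API): `spark.dynamicAllocation.minExecutors` defaults to 0, so a strictly-positive check (`_ > 0`) would reject its own default, whereas counts like `spark.task.cpus` genuinely must be strictly positive:

```scala
object CheckBounds {
  // Hypothetical helper, not Spark's API: fail fast if a value violates its bound.
  def validate(key: String, value: Int)(valid: Int => Boolean, msg: String): Int = {
    require(valid(value), s"$key: $msg (got $value)")
    value
  }

  def main(args: Array[String]): Unit = {
    // minExecutors defaults to 0, so a non-negative check (_ >= 0) accepts it...
    println(validate("spark.dynamicAllocation.minExecutors", 0)(_ >= 0, "must be >= 0"))
    // ...while a strictly-positive check (_ > 0) would reject that same default:
    try validate("spark.dynamicAllocation.minExecutors", 0)(_ > 0, "must be > 0")
    catch { case e: IllegalArgumentException => println("rejected: " + e.getMessage) }
  }
}
```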




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79602 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79602/testReport)** for PR 18555 at commit [`fcb9329`](https://github.com/apache/spark/commit/fcb9329220b51a43dddf8ac6a550295c8891d23b).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.




[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79364 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79364/testReport)** for PR 18555 at commit [`6512814`](https://github.com/apache/spark/commit/651281417ee180263e2d95316327691551af10ab).




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126273760
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -35,10 +35,22 @@ package object config {
         ConfigBuilder(SparkLauncher.DRIVER_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val DRIVER_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.driver.userClassPathFirst")
    +      .doc("Whether to give user-added jars precedence over Spark's own jars " +
    +        "when loading classes in the driver. This feature can be used to mitigate " +
    +        "conflicts between Spark's dependencies and user dependencies. " +
    +        "It is currently an experimental feature. This is used in cluster mode only. ")
    +      .booleanConf
    +      .checkValues(Set(false, true))
    --- End diff --
    
    thanks.
    please review it again.




[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Please open a JIRA for this PR




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79637 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79637/testReport)** for PR 18555 at commit [`17afdd7`](https://github.com/apache/spark/commit/17afdd7a2f5fa4333bd243cad78265f28eb1f62d).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126315869
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -306,19 +396,23 @@ package object config {
             "record the size accurately if it's above this config. This helps to prevent OOM by " +
             "avoiding underestimating shuffle block size when fetch shuffle blocks.")
           .bytesConf(ByteUnit.BYTE)
    +      .checkValue(_ > 0, "The number of BlockThreshold event queue must not be negative")
           .createWithDefault(100 * 1024 * 1024)
     
       private[spark] val SHUFFLE_REGISTRATION_TIMEOUT =
         ConfigBuilder("spark.shuffle.registration.timeout")
           .doc("Timeout in milliseconds for registration to the external shuffle service.")
           .timeConf(TimeUnit.MILLISECONDS)
    +      .checkValue(_ > 0,
    +        "Timeout in milliseconds for registration event queue must not be negative")
    --- End diff --
    
    ditto. `positive`.




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126273756
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -35,10 +35,22 @@ package object config {
         ConfigBuilder(SparkLauncher.DRIVER_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val DRIVER_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.driver.userClassPathFirst")
    +      .doc("Whether to give user-added jars precedence over Spark's own jars " +
    +        "when loading classes in the driver. This feature can be used to mitigate " +
    +        "conflicts between Spark's dependencies and user dependencies. " +
    +        "It is currently an experimental feature. This is used in cluster mode only. ")
    +      .booleanConf
    +      .checkValues(Set(false, true))
    +      .createWithDefault(false)
     
       private[spark] val DRIVER_MEMORY = ConfigBuilder("spark.driver.memory")
    +    .doc("Amount of memory to use for the driver process, i.e. " +
    +      "where SparkContext is initialized. (e.g. 1g, 2g). ")
    --- End diff --
    
    OK.




[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by jiangxb1987 <gi...@git.apache.org>.
Github user jiangxb1987 commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    We should also check for duplicated config check-value logic in the code, but I think that could be addressed in a follow-up PR.
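One way the duplication could be factored out, as a sketch only (the names below are assumptions, not Spark's actual follow-up): define the shared predicate and message template once and reuse them across configs, instead of repeating the same lambda and string at every `checkValue` call site.

```scala
object SharedChecks {
  // Single definition of "strictly positive int", reusable by many configs.
  val positiveInt: Int => Boolean = _ > 0

  // Single message template, so wording stays consistent across configs.
  def atLeastOneMsg(key: String): String = s"$key should be greater than or equal to 1"

  def checkPositive(key: String, value: Int): Int = {
    require(positiveInt(value), atLeastOneMsg(key))
    value
  }

  def main(args: Array[String]): Unit = {
    println(checkPositive("spark.task.maxFailures", 4))
    println(checkPositive("spark.blacklist.task.maxTaskAttemptsPerExecutor", 1))
  }
}
```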




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    retest this please




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79634/
    Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #87053 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87053/testReport)** for PR 18555 at commit [`a0efb41`](https://github.com/apache/spark/commit/a0efb41fe041520bf423933b3c7b18ad2feb6711).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79361 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79361/testReport)** for PR 18555 at commit [`b19b378`](https://github.com/apache/spark/commit/b19b378ed72bfa910e889f23bc4f848324445d08).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126274203
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -88,50 +137,76 @@ package object config {
         .createOptional
     
       private[spark] val PY_FILES = ConfigBuilder("spark.submit.pyFiles")
    +    .doc("Comma-separated list of .zip, .egg, or .py files to place on " +
    +      "the PYTHONPATH for Python apps.")
         .internal()
         .stringConf
         .toSequence
         .createWithDefault(Nil)
     
       private[spark] val MAX_TASK_FAILURES =
         ConfigBuilder("spark.task.maxFailures")
    +      .doc("Number of failures of any particular task before giving up on the job. " +
    +        "The total number of failures spread across different tasks " +
    +        "will not cause the job to fail; " +
    +        "a particular task has to fail this number of attempts. " +
    +        "Should be greater than or equal to 1. Number of allowed retries = this value - 1.")
           .intConf
    +      .checkValue(_ > 0, "The retry times of task should be greater than or equal to 1")
           .createWithDefault(4)
     
       // Blacklist confs
       private[spark] val BLACKLIST_ENABLED =
         ConfigBuilder("spark.blacklist.enabled")
    +      .doc("If set to 'true', prevent Spark from scheduling tasks on executors " +
    +        "that have been blacklisted due to too many task failures. " +
    +        "The blacklisting algorithm can be further controlled by " +
    +        "the other 'spark.blacklist' configuration options. ")
           .booleanConf
           .createOptional
     
       private[spark] val MAX_TASK_ATTEMPTS_PER_EXECUTOR =
         ConfigBuilder("spark.blacklist.task.maxTaskAttemptsPerExecutor")
    +      .doc("For a given task, how many times it can be " +
    +        "retried on one executor before the executor is blacklisted for that task.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(1)
     
       private[spark] val MAX_TASK_ATTEMPTS_PER_NODE =
         ConfigBuilder("spark.blacklist.task.maxTaskAttemptsPerNode")
    +      .doc("For a given task, how many times it can be " +
    +        "retried on one node, before the entire node is blacklisted for that task.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(2)
     
       private[spark] val MAX_FAILURES_PER_EXEC =
         ConfigBuilder("spark.blacklist.application.maxFailedTasksPerExecutor")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(2)
     
       private[spark] val MAX_FAILURES_PER_EXEC_STAGE =
         ConfigBuilder("spark.blacklist.stage.maxFailedTasksPerExecutor")
    +      .doc("How many different tasks must fail on one executor, " +
    +        "within one stage, before the executor is blacklisted for that stage.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
    --- End diff --
    
    ditto




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79926 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79926/testReport)** for PR 18555 at commit [`26afb05`](https://github.com/apache/spark/commit/26afb05a78447590144def42a54d9f88095c78a5).
     * This patch **fails Scala style tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    ok to test




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79932 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79932/testReport)** for PR 18555 at commit [`77a7ca4`](https://github.com/apache/spark/commit/77a7ca41aa08578935727bdfa7f24e519b6b73ad).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    retest this please




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79429 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79429/testReport)** for PR 18555 at commit [`d550a68`](https://github.com/apache/spark/commit/d550a688d2213db42833ef7540270d9c760a2024).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #87048 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87048/testReport)** for PR 18555 at commit [`c213be6`](https://github.com/apache/spark/commit/c213be69ae545cc360a9480969c1aa02d1e51e39).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #80117 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80117/testReport)** for PR 18555 at commit [`92d86b2`](https://github.com/apache/spark/commit/92d86b2a216ff9eddb1dc31c7c89472e1e0298fb).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79428/
    Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79549 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79549/testReport)** for PR 18555 at commit [`dd066b6`](https://github.com/apache/spark/commit/dd066b60bcb4257d3825a47840b68b9b55f4131a).




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126273196
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -35,10 +35,22 @@ package object config {
         ConfigBuilder(SparkLauncher.DRIVER_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val DRIVER_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.driver.userClassPathFirst")
    +      .doc("Whether to give user-added jars precedence over Spark's own jars " +
    +        "when loading classes in the driver. This feature can be used to mitigate " +
    +        "conflicts between Spark's dependencies and user dependencies. " +
    +        "It is currently an experimental feature. This is used in cluster mode only. ")
    +      .booleanConf
    +      .checkValues(Set(false, true))
    --- End diff --
    
    boolean conf doesn't need `checkValue`...
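A boolean conf validates itself at parse time, so a `checkValues(Set(false, true))` can never reject anything. A minimal standalone sketch (simplified, not Spark's actual `ConfigBuilder`) of why the check is redundant:

```scala
// Simplified model of how a boolean conf parses a raw string value.
// The parser itself rejects anything outside {"true", "false"}, so a
// later checkValues(Set(false, true)) has nothing left to filter.
object BooleanConfSketch {
  def parseBoolean(key: String, raw: String): Boolean =
    try raw.trim.toBoolean  // StringOps.toBoolean throws on non-boolean input
    catch {
      case _: IllegalArgumentException =>
        throw new IllegalArgumentException(s"$key should be boolean, but was: $raw")
    }
}
```

A call like `parseBoolean("spark.driver.userClassPathFirst", "maybe")` already fails during parsing, before any `checkValue` would run.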




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79927 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79927/testReport)** for PR 18555 at commit [`364cfc4`](https://github.com/apache/spark/commit/364cfc4d6c8efc2a260d2a7cd578d9099851e3d2).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #84786 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/84786/testReport)** for PR 18555 at commit [`3135235`](https://github.com/apache/spark/commit/31352352310eb36a9bc47d346781b02288795be1).
     * This patch **fails to build**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79635/
    Test FAILed.




[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126329325
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +62,61 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 512, 1g, 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Long.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0 MiB " +
    +      s"and less than ${Long.MaxValue / (1024 * 1024)} MiB.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(_ > 0,
    +      "Number of cores to allocate for task event queue must not be negative")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    --- End diff --
    
    Also, it would be great if you check the error message of the exception.
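A negative test of that shape can be sketched standalone. The validator below mimics the `checkValue` pattern from the diff above; the helper object and method names are illustrative, not Spark's test API:

```scala
// Standalone sketch of a checkValue-style validator plus the kind of
// negative test being requested: assert on the exception *message*,
// not just the exception type.
object CheckValueMessageSketch {
  def checkValue[T](v: T)(valid: T => Boolean, msg: String): T =
    if (valid(v)) v
    else throw new IllegalArgumentException(msg)

  val minExecutorsMsg =
    "Lower bound for the number of executors should be greater than or equal to 0"

  // Mirrors the minExecutors check in the diff: value must be >= 0.
  def validateMinExecutors(v: Int): Int =
    checkValue(v)(_ >= 0, minExecutorsMsg)
}
```

In a ScalaTest suite this would typically be `val e = intercept[IllegalArgumentException] { validateMinExecutors(-1) }` followed by an assertion on `e.getMessage`.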




[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by jiangxb1987 <gi...@git.apache.org>.
Github user jiangxb1987 commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Please also update the PR description. Thanks!




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126293363
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -35,10 +35,21 @@ package object config {
         ConfigBuilder(SparkLauncher.DRIVER_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val DRIVER_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.driver.userClassPathFirst")
    +      .doc("Whether to give user-added jars precedence over Spark's own jars " +
    +        "when loading classes in the driver. This feature can be used to mitigate " +
    +        "conflicts between Spark's dependencies and user dependencies. " +
    +        "It is currently an experimental feature. This is used in cluster mode only. ")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val DRIVER_MEMORY = ConfigBuilder("spark.driver.memory")
    +    .doc("Amount of memory to use for the driver process, i.e. " +
    +      "where SparkContext is initialized. (e.g. 512, 1g, 2g). ")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    +      "The driver memory must be greater than 0MB " +
    --- End diff --
    
    Is it more accurate to use MiB instead of MB here? Also, please add whitespace, like `0 MiB`.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/84786/
    Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79609 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79609/testReport)** for PR 18555 at commit [`aa6c621`](https://github.com/apache/spark/commit/aa6c62122ba3091695d19c8915361e3b8a24f363).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    cc @cloud-fan 




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79932 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79932/testReport)** for PR 18555 at commit [`77a7ca4`](https://github.com/apache/spark/commit/77a7ca41aa08578935727bdfa7f24e519b6b73ad).
     * This patch **fails to build**.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126274186
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +63,66 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .checkValues(Set(false, true))
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0M " +
    +      s"and less than ${Int.MaxValue / (1024 * 1024)}M.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .checkValues(Set(false, true))
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(v => v > 0,
    +      "Number of cores to allocate for task event queue must not be negative")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    +      .checkValue(v => v >= 0,
    +        "Lower bound for the number of executors should be greater than or equal to 0")
    +      .createWithDefault(0)
     
       private[spark] val DYN_ALLOCATION_INITIAL_EXECUTORS =
         ConfigBuilder("spark.dynamicAllocation.initialExecutors")
    +      .doc("Initial number of executors to run if dynamic allocation is enabled. " +
    +        "If `--num-executors` (or `spark.executor.instances`) is set " +
    +        "and larger than this value, " +
    +        "it will be used as the initial number of executors.")
           .fallbackConf(DYN_ALLOCATION_MIN_EXECUTORS)
     
       private[spark] val DYN_ALLOCATION_MAX_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.maxExecutors").intConf.createWithDefault(Int.MaxValue)
    +    ConfigBuilder("spark.dynamicAllocation.maxExecutors")
    +      .doc("Upper bound for the number of executors if dynamic allocation is enabled. ")
    +      .intConf
    +      .createWithDefault(Int.MaxValue)
     
       private[spark] val SHUFFLE_SERVICE_ENABLED =
    -    ConfigBuilder("spark.shuffle.service.enabled").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.shuffle.service.enabled")
    +      .doc("Enables the external shuffle service. This service preserves " +
    +        "the shuffle files written by executors so the executors can be safely removed. " +
    +        "This must be enabled if spark.dynamicAllocation.enabled is 'true'. " +
    +        "The external shuffle service must be set up in order to enable it. " +
    +        "See dynamic allocation configuration and setup documentation for more information. ")
    +      .booleanConf
    +      .checkValues(Set(false, true))
    --- End diff --
    
    ditto




[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126315834
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +62,61 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 512, 1g, 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Long.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0 MiB " +
    +      s"and less than ${Long.MaxValue / (1024 * 1024)} MiB.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(_ > 0,
    +      "Number of cores to allocate for task event queue must not be negative")
    --- End diff --
    
    Since `_ > 0`, we should avoid 0, too.
    `must not be negative` => `must be positive`.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #87024 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87024/testReport)** for PR 18555 at commit [`c213be6`](https://github.com/apache/spark/commit/c213be69ae545cc360a9480969c1aa02d1e51e39).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Can one of the admins verify this patch?




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/86961/
    Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79460 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79460/testReport)** for PR 18555 at commit [`51f0cec`](https://github.com/apache/spark/commit/51f0cecebc950a5e61fb9a9290093982b4bbc1b1).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Can one of the admins verify this patch?




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    First, thank you for working on it!
    
    For any extra check, we need a negative test case to cover it.
    
    We have to be very careful to ensure we do not break existing applications. For each change, we need to answer whether the so-called "illegal" values still work properly without this PR.




[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79364 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79364/testReport)** for PR 18555 at commit [`6512814`](https://github.com/apache/spark/commit/651281417ee180263e2d95316327691551af10ab).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r166171819
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -231,6 +315,9 @@ package object config {
       private[spark] val BLOCK_MANAGER_PORT = ConfigBuilder("spark.blockManager.port")
         .doc("Port to use for the block manager when a more specific setting is not provided.")
         .intConf
    +    .checkValue(v => v == 0 || (1024 <= v && v < 65536),
    --- End diff --
    
    well, thanks!




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #86961 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/86961/testReport)** for PR 18555 at commit [`cac5c03`](https://github.com/apache/spark/commit/cac5c030919660bcfbf98841f7da159644d83f58).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    @gatorsmile @cloud-fan 
    please review it again.
    thanks.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79428 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79428/testReport)** for PR 18555 at commit [`9f924e5`](https://github.com/apache/spark/commit/9f924e53e83a3edd9413d4efc4a73b6053b0d6df).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79429 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79429/testReport)** for PR 18555 at commit [`d550a68`](https://github.com/apache/spark/commit/d550a688d2213db42833ef7540270d9c760a2024).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126273242
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -35,10 +35,22 @@ package object config {
         ConfigBuilder(SparkLauncher.DRIVER_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val DRIVER_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.driver.userClassPathFirst")
    +      .doc("Whether to give user-added jars precedence over Spark's own jars " +
    +        "when loading classes in the driver. This feature can be used to mitigate " +
    +        "conflicts between Spark's dependencies and user dependencies. " +
    +        "It is currently an experimental feature. This is used in cluster mode only. ")
    +      .booleanConf
    +      .checkValues(Set(false, true))
    +      .createWithDefault(false)
     
       private[spark] val DRIVER_MEMORY = ConfigBuilder("spark.driver.memory")
    +    .doc("Amount of memory to use for the driver process, i.e. " +
    +      "where SparkContext is initialized. (e.g. 1g, 2g). ")
    --- End diff --
    
    we should also mention that, if no unit is given, e.g. `500`, it means 500mb. We also need to update `configuration.md` for this conf.
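
As a point of reference, `spark.driver.memory` is declared with `bytesConf(ByteUnit.MiB)`, so a bare number is read in mebibytes while suffixed values are converted. Spark's byte-string parser illustrates this behavior (a sketch, assuming `org.apache.spark.network.util.JavaUtils` is on the classpath):

```scala
import org.apache.spark.network.util.JavaUtils

// A unitless value is interpreted in the entry's declared unit (MiB here),
// which is why a bare "500" means 500m rather than 500 bytes.
val unitless = JavaUtils.byteStringAsMb("500") // 500
val suffixed = JavaUtils.byteStringAsMb("1g")  // 1024
```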




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126293557
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -231,6 +315,9 @@ package object config {
       private[spark] val BLOCK_MANAGER_PORT = ConfigBuilder("spark.blockManager.port")
         .doc("Port to use for the block manager when a more specific setting is not provided.")
         .intConf
    +    .checkValue(v => v == 0 || (1024 <= v && v < 65536),
    --- End diff --
    
    Maybe we should let user to decide whether to bind to a privileged port. Taking an assumption here may potentially break some use cases.
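
The trade-off being raised is between the predicate in the diff and a looser one that leaves privileged ports to the operator; schematically (a sketch of the two alternatives, not the code as merged):

```scala
// Predicate from the diff: 0 (pick a random port) or a non-privileged port.
val strict: Int => Boolean = v => v == 0 || (1024 <= v && v < 65536)

// Looser alternative: any valid TCP port, letting the user decide whether
// binding to a privileged port (< 1024) makes sense for their deployment.
val loose: Int => Boolean = v => 0 <= v && v < 65536

assert(strict(0) && strict(1024) && !strict(80))
assert(loose(80) && !loose(65536))
```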




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Can one of the admins verify this patch?




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126090693
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -38,7 +38,12 @@ package object config {
         ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
     
       private[spark] val DRIVER_MEMORY = ConfigBuilder("spark.driver.memory")
    +    .doc("Amount of memory to use for the driver process, i.e. " +
    --- End diff --
    
    OK, updated.
    Please review it again, thanks.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/87048/
    Test PASSed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79650 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79650/testReport)** for PR 18555 at commit [`722f26e`](https://github.com/apache/spark/commit/722f26e55e53dac263f3a0bce899d9881d9e01dd).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    retest this please




[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79421 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79421/testReport)** for PR 18555 at commit [`5040998`](https://github.com/apache/spark/commit/5040998db9aacb2dd684c1893f3312adbe4033a4).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79421 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79421/testReport)** for PR 18555 at commit [`5040998`](https://github.com/apache/spark/commit/5040998db9aacb2dd684c1893f3312adbe4033a4).
     * This patch **fails to build**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79933/
    Test PASSed.



[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126308305
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +63,66 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .checkValues(Set(false, true))
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0M " +
    +      s"and less than ${Int.MaxValue / (1024 * 1024)}M.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .checkValues(Set(false, true))
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(v => v > 0,
    +      "Number of cores to allocate for task event queue must not be negative")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    +      .checkValue(v => v >= 0,
    +        "Lower bound for the number of executors should be greater than or equal to 0")
    +      .createWithDefault(0)
     
       private[spark] val DYN_ALLOCATION_INITIAL_EXECUTORS =
         ConfigBuilder("spark.dynamicAllocation.initialExecutors")
    +      .doc("Initial number of executors to run if dynamic allocation is enabled. " +
    +        "If `--num-executors` (or `spark.executor.instances`) is set " +
    +        "and larger than this value, " +
    +        "it will be used as the initial number of executors.")
           .fallbackConf(DYN_ALLOCATION_MIN_EXECUTORS)
     
       private[spark] val DYN_ALLOCATION_MAX_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.maxExecutors").intConf.createWithDefault(Int.MaxValue)
    +    ConfigBuilder("spark.dynamicAllocation.maxExecutors")
    +      .doc("Upper bound for the number of executors if dynamic allocation is enabled. ")
    +      .intConf
    +      .createWithDefault(Int.MaxValue)
    --- End diff --
    
    This might be a bit off-topic, but in org.apache.spark.ExecutorAllocationManagerSuite.scala there is
    `val conf2 = conf.clone().set("spark.dynamicAllocation.maxExecutors", "-1")`
    Why is this value set to less than zero? I don't understand it yet.
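
For context, the suite in question deliberately sets an invalid value to exercise ExecutorAllocationManager's own validation path; adding a checkValue to the entry would reject that value earlier, which is the conflict (a sketch of the interaction, not the merged code):

```scala
import org.apache.spark.SparkConf

// ExecutorAllocationManagerSuite intentionally sets an invalid value so the
// manager's own validation path can be tested:
val conf2 = new SparkConf().set("spark.dynamicAllocation.maxExecutors", "-1")
// With .checkValue(v => v > 0, ...) on the entry, reading the config would
// throw IllegalArgumentException before the manager's check ever runs, so
// such tests would need to change alongside this PR.
```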




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79634 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79634/testReport)** for PR 18555 at commit [`1c04036`](https://github.com/apache/spark/commit/1c04036d347f51a6899b91fd83f8c7e97cada3f0).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79634 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79634/testReport)** for PR 18555 at commit [`1c04036`](https://github.com/apache/spark/commit/1c04036d347f51a6899b91fd83f8c7e97cada3f0).
     * This patch **fails Scala style tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126090608
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -38,7 +38,12 @@ package object config {
         ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
     
       private[spark] val DRIVER_MEMORY = ConfigBuilder("spark.driver.memory")
    +    .doc("Amount of memory to use for the driver process, i.e. " +
    +      "where SparkContext is initialized. (e.g. 1g, 2g). ")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    +      s"The driver memory must be greater than 0MB " +
    --- End diff --
    
    Removed it, thanks.
    Please review it again.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79648 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79648/testReport)** for PR 18555 at commit [`8a6b817`](https://github.com/apache/spark/commit/8a6b817d62397b780eb5f0b3b3be7dd9d3a733f1).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79609/
    Test PASSed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79421/
    Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    cc @HyukjinKwon




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    retest this please




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Build finished. Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.



[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79364/
    Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79648 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79648/testReport)** for PR 18555 at commit [`8a6b817`](https://github.com/apache/spark/commit/8a6b817d62397b780eb5f0b3b3be7dd9d3a733f1).


---


[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79361 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79361/testReport)** for PR 18555 at commit [`b19b378`](https://github.com/apache/spark/commit/b19b378ed72bfa910e889f23bc4f848324445d08).


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79639/
    Test FAILed.


---


[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126292668
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +62,63 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 512, 1g, 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0M " +
    +      s"and less than ${Int.MaxValue / (1024 * 1024)}M.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(v => v > 0,
    +      "Number of cores to allocate for task event queue must not be negative")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    +      .checkValue(v => v >= 0,
    --- End diff --
    
    Actually, configurations like `spark.dynamicAllocation.minExecutors` already have validation logic in the code that honors them (ExecutorAllocationManager), so checking again here would be duplication.
    
    Also, one limitation of validating here is that it can only verify a single configuration in isolation, but some confs like `spark.dynamicAllocation.minExecutors` need to be verified against other confs, e.g. `spark.dynamicAllocation.minExecutors <= xxxx.initialExecutors <= xxx.maxExecutors`.
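    A cross-configuration check of that kind has to live where all the values are available at once. A minimal sketch (hypothetical helper, not the code in this PR — the parameter names stand in for values read from a `SparkConf`):
    
    ```scala
    // Hypothetical cross-configuration check for dynamic allocation. A
    // per-entry checkValue cannot express this, because it only ever sees
    // one configuration value at a time; this helper sees all three.
    def validateDynamicAllocation(
        minExecutors: Int,
        initialExecutors: Int,
        maxExecutors: Int): Unit = {
      // require throws IllegalArgumentException with the given message
      require(minExecutors >= 0,
        s"spark.dynamicAllocation.minExecutors ($minExecutors) must be >= 0")
      require(minExecutors <= initialExecutors && initialExecutors <= maxExecutors,
        s"Required: minExecutors ($minExecutors) <= initialExecutors " +
        s"($initialExecutors) <= maxExecutors ($maxExecutors)")
    }
    ```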


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #91602 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/91602/testReport)** for PR 18555 at commit [`a0efb41`](https://github.com/apache/spark/commit/a0efb41fe041520bf423933b3c7b18ad2feb6711).
     * This patch **fails Spark unit tests**.
     * This patch **does not merge cleanly**.
     * This patch adds no public classes.


---



[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126273417
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -35,10 +35,22 @@ package object config {
         ConfigBuilder(SparkLauncher.DRIVER_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val DRIVER_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.driver.userClassPathFirst")
    +      .doc("Whether to give user-added jars precedence over Spark's own jars " +
    +        "when loading classes in the driver. This feature can be used to mitigate " +
    +        "conflicts between Spark's dependencies and user dependencies. " +
    +        "It is currently an experimental feature. This is used in cluster mode only. ")
    +      .booleanConf
    +      .checkValues(Set(false, true))
    +      .createWithDefault(false)
     
       private[spark] val DRIVER_MEMORY = ConfigBuilder("spark.driver.memory")
    +    .doc("Amount of memory to use for the driver process, i.e. " +
    +      "where SparkContext is initialized. (e.g. 1g, 2g). ")
    --- End diff --
    
    nvm, we should not mention that and always ask users to add unit.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79648/
    Test FAILed.


---


[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by jiangxb1987 <gi...@git.apache.org>.
Github user jiangxb1987 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r125901919
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -38,7 +38,12 @@ package object config {
         ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
     
       private[spark] val DRIVER_MEMORY = ConfigBuilder("spark.driver.memory")
    +    .doc("Amount of memory to use for the driver process, i.e. " +
    --- End diff --
    
    Can you also check for other configs in this file and add `checkValue`s as much as possible? Thanks!


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126329304
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +62,61 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 512, 1g, 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Long.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0 MiB " +
    +      s"and less than ${Long.MaxValue / (1024 * 1024)} MiB.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(_ > 0,
    +      "Number of cores to allocate for task event queue must not be negative")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    --- End diff --
    
    There is a `SparkException` test case for this validation in [ExecutorAllocationManager](https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala#L177-L179).
    
    I think you can replace `SparkException` with your exception in `ExecutorAllocationManagerSuite`.
    ```
        // Min < 0
        val conf1 = conf.clone().set("spark.dynamicAllocation.minExecutors", "-1")
        intercept[SparkException] { contexts += new SparkContext(conf1) }
    ```
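    The fail-fast behaviour the new `checkValue` gives can be sketched without Spark. Below is a hypothetical stand-in for `ConfigBuilder`'s checkValue semantics (not Spark's actual API): validation runs the moment the value is parsed and throws `IllegalArgumentException`, which is what the suite would then intercept instead of the later `SparkException`.
    
    ```scala
    // Minimal stand-in for a config entry with checkValue semantics:
    // validate at parse time and fail fast with IllegalArgumentException.
    final case class IntEntry(key: String, check: Int => Boolean, msg: String) {
      def parse(raw: String): Int = {
        val v = raw.trim.toInt
        if (!check(v)) throw new IllegalArgumentException(s"$key: $msg")
        v
      }
    }
    
    val minExecutors = IntEntry(
      "spark.dynamicAllocation.minExecutors",
      _ >= 0,
      "Lower bound for the number of executors should be greater than or equal to 0")
    ```
    
    With this, `minExecutors.parse("-1")` fails immediately with `IllegalArgumentException` rather than surfacing later from ExecutorAllocationManager.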


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79650 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79650/testReport)** for PR 18555 at commit [`722f26e`](https://github.com/apache/spark/commit/722f26e55e53dac263f3a0bce899d9881d9e01dd).


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    cc @HyukjinKwon, @cloud-fan



---



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79637/
    Test FAILed.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    @gatorsmile 
    @gatorsmile 
    Sorry, do you mean `> 0 or >= 0` also needs to be added?
    I just added a test case for `v => v == 0 || (1024 <= v && v < 65536)`:
    `test("blockManager port requires authentication ")` in SparkConfSuite.scala.
    Thank you.



---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by kiszk <gi...@git.apache.org>.
Github user kiszk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126660036
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +62,65 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 512, 1g, 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / 1024,
    +      "The executor memory must be greater than 0 MiB " +
    +      s"and less than ${Int.MaxValue / 1024} MiB.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    --- End diff --
    
    nit: remove space after `.`


---


[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126274200
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -88,50 +137,76 @@ package object config {
         .createOptional
     
       private[spark] val PY_FILES = ConfigBuilder("spark.submit.pyFiles")
    +    .doc("Comma-separated list of .zip, .egg, or .py files to place on " +
    +      "the PYTHONPATH for Python apps.")
         .internal()
         .stringConf
         .toSequence
         .createWithDefault(Nil)
     
       private[spark] val MAX_TASK_FAILURES =
         ConfigBuilder("spark.task.maxFailures")
    +      .doc("Number of failures of any particular task before giving up on the job. " +
    +        "The total number of failures spread across different tasks " +
    +        "will not cause the job to fail; " +
    +        "a particular task has to fail this number of attempts. " +
    +        "Should be greater than or equal to 1. Number of allowed retries = this value - 1.")
           .intConf
    +      .checkValue(_ > 0, "The retry times of task should be greater than or equal to 1")
           .createWithDefault(4)
     
       // Blacklist confs
       private[spark] val BLACKLIST_ENABLED =
         ConfigBuilder("spark.blacklist.enabled")
    +      .doc("If set to 'true', prevent Spark from scheduling tasks on executors " +
    +        "that have been blacklisted due to too many task failures. " +
    +        "The blacklisting algorithm can be further controlled by " +
    +        "the other 'spark.blacklist' configuration options. ")
           .booleanConf
           .createOptional
     
       private[spark] val MAX_TASK_ATTEMPTS_PER_EXECUTOR =
         ConfigBuilder("spark.blacklist.task.maxTaskAttemptsPerExecutor")
    +      .doc("For a given task, how many times it can be " +
    +        "retried on one executor before the executor is blacklisted for that task.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(1)
     
       private[spark] val MAX_TASK_ATTEMPTS_PER_NODE =
         ConfigBuilder("spark.blacklist.task.maxTaskAttemptsPerNode")
    +      .doc("For a given task, how many times it can be " +
    +        "retried on one node, before the entire node is blacklisted for that task.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
    --- End diff --
    
    ditto


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126315850
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -251,6 +335,7 @@ package object config {
       private[spark] val FILES_MAX_PARTITION_BYTES = ConfigBuilder("spark.files.maxPartitionBytes")
         .doc("The maximum number of bytes to pack into a single partition when reading files.")
         .longConf
    +    .checkValue(_ > 0, "The maximum number of bytes event queue must not be negative")
    --- End diff --
    
    Since `_ > 0`, we should avoid 0, too.
    `must not be negative` => `must be positive`.


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126329361
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +63,66 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .checkValues(Set(false, true))
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0M " +
    +      s"and less than ${Int.MaxValue / (1024 * 1024)}M.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .checkValues(Set(false, true))
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(v => v > 0,
    +      "Number of cores to allocate for task event queue must not be negative")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    +      .checkValue(v => v >= 0,
    +        "Lower bound for the number of executors should be greater than or equal to 0")
    +      .createWithDefault(0)
     
       private[spark] val DYN_ALLOCATION_INITIAL_EXECUTORS =
         ConfigBuilder("spark.dynamicAllocation.initialExecutors")
    +      .doc("Initial number of executors to run if dynamic allocation is enabled. " +
    +        "If `--num-executors` (or `spark.executor.instances`) is set " +
    +        "and larger than this value, " +
    +        "it will be used as the initial number of executors.")
           .fallbackConf(DYN_ALLOCATION_MIN_EXECUTORS)
     
       private[spark] val DYN_ALLOCATION_MAX_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.maxExecutors").intConf.createWithDefault(Int.MaxValue)
    +    ConfigBuilder("spark.dynamicAllocation.maxExecutors")
    +      .doc("Upper bound for the number of executors if dynamic allocation is enabled. ")
    +      .intConf
    +      .createWithDefault(Int.MaxValue)
    --- End diff --
    
    This is the same.


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r129946652
  
    --- Diff: core/src/test/scala/org/apache/spark/SparkConfSuite.scala ---
    @@ -322,6 +324,291 @@ class SparkConfSuite extends SparkFunSuite with LocalSparkContext with ResetSyst
         conf.validateSettings()
       }
     
    +  test("verify spark.blockManager.port configuration") {
    +    val conf = new SparkConf(false)
    +      .setMaster("local").setAppName("My app")
    +
    +    conf.validateSettings()
    +    assert(!conf.contains(BLOCK_MANAGER_PORT.key))
    +
    +    Seq(
    +      "0", // normal values
    +      "1024", // min values
    +      "65535" // max values
    +    ).foreach { value =>
    +      conf.set(BLOCK_MANAGER_PORT.key, value)
    +      var sc0 = new SparkContext(conf)
    +      assert(sc0.isStopped === false)
    +      assert(sc0.conf.get(BLOCK_MANAGER_PORT) === value.toInt)
    +      sc0.stop()
    +      conf.remove(BLOCK_MANAGER_PORT)
    +    }
    +
    +    // Verify abnormal values
    +    Seq(
    +      "-1",
    +      "1000",
    +      "65536"
    +    ).foreach { value =>
    +      conf.set(BLOCK_MANAGER_PORT.key, value)
    +      val excMsg = intercept[IllegalArgumentException] {
    +        new SparkContext(conf)
    +      }.getMessage
    +      // Caused by: java.lang.IllegalArgumentException:
    +      // blockManager port should be between 1024 and 65535 (inclusive),
    +      // or 0 for a random free port.
    +      assert(excMsg.contains("blockManager port should be between 1024 " +
    +        "and 65535 (inclusive), or 0 for a random free port."))
    +
    +      conf.remove(BLOCK_MANAGER_PORT)
    +    }
    +  }
    +
    +  test("verify spark.executor.memory configuration exception") {
    +    val conf = new SparkConf(false)
    +      .setMaster("local").setAppName("executor memory")
    +      .set(EXECUTOR_MEMORY.key, "-1")
    +    val excMsg = intercept[NumberFormatException] {
    +      sc = new SparkContext(conf)
    +    }.getMessage
    +    // Caused by: java.lang.NumberFormatException:
    +    // Size must be specified as bytes (b), kibibytes (k),
    +    // mebibytes (m), gibibytes (g), tebibytes (t),
    +    // or pebibytes(p). E.g. 50b, 100k, or 250m.
    +    assert(excMsg.contains("Size must be specified as bytes (b), kibibytes (k), " +
    +      "mebibytes (m), gibibytes (g), tebibytes (t), or pebibytes(p). E.g. 50b, 100k, or 250m."))
    +  }
    +
    +  test("verify spark.task.cpus configuration exception") {
    +    val conf = new SparkConf(false)
    +      .setMaster("local").setAppName("cpus")
    +      .set(CPUS_PER_TASK.key, "-1")
    +    val excMsg = intercept[IllegalArgumentException] {
    +      sc = new SparkContext(conf)
    +    }.getMessage
    +    // Caused by: java.lang.IllegalArgumentException:
    +    // Number of cores to allocate for task event queue must be positive.
    +    assert(excMsg.contains("Number of cores to allocate for task event queue must be positive."))
    +  }
    +
    +  test("verify spark.task.maxFailures configuration exception") {
    +    val conf = new SparkConf(false)
    +      .setMaster("local").setAppName("task maxFailures")
    +      .set(MAX_TASK_FAILURES.key, "-1")
    +    val sc0 = new SparkContext(conf)
    +    val excMsg = intercept[IllegalArgumentException] {
    +      new TaskSchedulerImpl(sc0)
    +    }.getMessage
    +    // Caused by: java.lang.IllegalArgumentException:
    +    // The retry times of task should be greater than or equal to 1.
    +    assert(excMsg.contains("The retry times of task should be greater than or equal to 1."))
    +    sc0.stop()
    +  }
    +
    +  test("verify listenerbus.eventqueue.capacity configuration exception") {
    +    val conf = new SparkConf(false)
    +      .setMaster("local").setAppName("capacity")
    +      .set(LISTENER_BUS_EVENT_QUEUE_CAPACITY.key, "-1")
    +    val excMsg = intercept[IllegalArgumentException] {
    +      sc = new SparkContext(conf)
    +    }.getMessage
    +    // Caused by: java.lang.IllegalArgumentException:
    +    // The capacity of listener bus event queue must be positive.
    +    assert(excMsg.contains("The capacity of listener bus event queue must be positive."))
    +  }
    +
    +  test("verify metrics.maxListenerClassesTimed configuration exception") {
    +    val conf = new SparkConf(false)
    +      .setMaster("local").setAppName("listenerbus")
    +      .set(LISTENER_BUS_METRICS_MAX_LISTENER_CLASSES_TIMED.key, "-1")
    +    val excMsg = intercept[IllegalArgumentException] {
    +      sc = new SparkContext(conf)
    +    }.getMessage
    +    // Caused by: java.lang.IllegalArgumentException:
    +    // The maxListenerClassesTimed of listener bus event queue must be positive.
    +    assert(excMsg.contains("The maxListenerClassesTimed of listener bus " +
    +      "event queue must be positive."))
    +  }
    +
    +  test("verify spark.ui.retained configuration exception") {
    +    val conf = new SparkConf(false)
    +      .setMaster("local").setAppName("retained")
    +      .set(UI_RETAINED_TASKS.key, "-1")
    +    val excMsg = intercept[IllegalArgumentException] {
    +      sc = new SparkContext(conf)
    +    }.getMessage
    +    // Caused by: java.lang.IllegalArgumentException:
    +    // Number of Tasks event queue must be positive.
    +    assert(excMsg.contains("Number of Tasks event queue must be positive."))
    +  }
    +
    +  test("verify spark.files.maxPartitionBytes configuration exception") {
    +    val conf = new SparkConf(false)
    +      .setMaster("local").setAppName("maxPartitionBytes")
    +      .set(FILES_MAX_PARTITION_BYTES.key, "-1")
    +    sc = new SparkContext(conf)
    +    val tempDir = Utils.createTempDir()
    +    val binaryRDD = sc.binaryFiles(tempDir.getAbsolutePath)
    +    val excMsg = intercept[IllegalArgumentException] {
    +      // computing the RDD partitions triggers the validation error
    +      binaryRDD.partitions
    +    }.getMessage
    +    // Caused by: java.lang.IllegalArgumentException:
    +    // The maximum number of bytes event queue must be positive.
    +    assert(excMsg.contains("The maximum number of bytes event queue must be positive."))
    +  }
    +
    +  test("verify spark.files.openCostInBytes configuration exception") {
    +    val conf = new SparkConf(false)
    +      .setMaster("local").setAppName("openCostInBytes")
    +      .set(FILES_OPEN_COST_IN_BYTES.key, "-1")
    +    sc = new SparkContext(conf)
    +    val tempDir = Utils.createTempDir()
    +    val binaryRDD = sc.binaryFiles(tempDir.getAbsolutePath)
    +    val excMsg = intercept[IllegalArgumentException] {
    +      // computing the RDD partitions triggers the validation error
    +      binaryRDD.partitions
    +    }.getMessage
    +    // Caused by: java.lang.IllegalArgumentException:
    +    // The number of CostInBytes event queue must be positive.
    +    assert(excMsg.contains("The number of CostInBytes event queue must be positive."))
    +  }
    +
    +  test("verify spark.shuffle.accurateBlockThreshold configuration exception") {
    +    val conf = new SparkConf(false)
    +      .set("spark.serializer", classOf[KryoSerializer].getName)
    +      .setMaster("local").setAppName("accurateBlockThreshold")
    +      .set(SHUFFLE_ACCURATE_BLOCK_THRESHOLD.key, "-1")
    +    sc = new SparkContext(conf)
    +    val rdd = sc.parallelize(0 until 3000, 10).repartition(2001)
    +    val excMsg = intercept[SparkException] {
    +      rdd.collect().length
    +    }.getMessage
    +    // Caused by: java.lang.IllegalArgumentException:
    +    // The number of BlockThreshold event queue must be positive.
    +    assert(excMsg.contains("The number of BlockThreshold event queue must be positive."))
    +  }
    +
    +  test("verify spark.shuffle.registration.timeout configuration exception") {
    +    val conf = new SparkConf(false)
    +      .setMaster("local").setAppName("timeout")
    +      .set(SHUFFLE_SERVICE_ENABLED, true)
    +      .set(SHUFFLE_REGISTRATION_TIMEOUT.key, "-1")
    +    val excMsg = intercept[IllegalArgumentException] {
    +      sc = new SparkContext(conf)
    +    }.getMessage
    +    // Caused by: java.lang.IllegalArgumentException:
    +    // Timeout in milliseconds for registration event queue must be positive.
    +    assert(excMsg.contains("Timeout in milliseconds for registration " +
    +      "event queue must be positive."))
    +    conf.remove(SHUFFLE_SERVICE_ENABLED)
    +  }
    +
    +  test("verify spark.reducer.maxReqSizeShuffleToMem configuration exception") {
    +    val conf = new SparkConf(false)
    +      .setMaster("local").setAppName("maxReqSizeShuffleToMem")
    +      .set(REDUCER_MAX_REQ_SIZE_SHUFFLE_TO_MEM.key, "-1")
    +    sc = new SparkContext(conf)
    +    val a = sc.parallelize(1 to 5, 201)
    +    val b = a.map(x => (x, x*2))
    +    val c = new ShuffledRDD[Int, Int, Int](b, new HashPartitioner(201))
    +      .setSerializer(new KryoSerializer(conf))
    +    val excMsg = intercept[SparkException] {
    +      c.collect()
    +    }.getMessage
    +    // Caused by: java.lang.IllegalArgumentException:
    +    // Size of the request is above this threshold event queue must be positive.
    +    assert(excMsg.contains("Size of the request is above this threshold " +
    +      "event queue must be positive."))
    +  }
    +
    +  test("verify shuffle file buffer size configuration") {
    +    val conf = new SparkConf(false)
    +
    +    // 2097151: Int.MaxValue / 1024
    +    val maxvalue = 2097151L
    +
    +    Seq(
    +      SHUFFLE_FILE_BUFFER_SIZE,
    +      SHUFFLE_UNSAFE_FILE_OUTPUT_BUFFER_SIZE
    +    ).foreach { config =>
    +      assert(!conf.contains(config))
    +
    +      // min values
    +      conf.set(config.key, "1k")
    +      assert(conf.get(config) === 1)
    +
    +      // max values
    +      conf.set(config.key, s"${maxvalue.toString}k")
    +      assert(conf.get(config) === maxvalue)
    +
    +      // Verify exception values
    +      Seq(
    +        "0k",
    +        "-1k",
    +        s"${(maxvalue + 1).toString}k"
    +      ).foreach { value =>
    +        conf.set(config.key, value)
    +        val sc0 = new SparkContext("local", "test", conf)
    +        val a = sc0.parallelize(1 to 5, 201)
    +        val b = a.map(x => (x, x*2))
    +        val c = new ShuffledRDD[Int, Int, Int](b, new HashPartitioner(201))
    +          .setSerializer(new KryoSerializer(conf))
    +
    +        val excMsg = intercept[SparkException] {
    +          c.collect()
    +        }.getMessage
    +        // Caused by: java.lang.IllegalArgumentException:
    +        // The file buffer size must be greater than 0 and less than ${Int.MaxValue / 1024}.
    +        // or The buffer size must be greater than 0 and less than ${Int.MaxValue / 1024}.
    +        assert(excMsg.contains("buffer size must be greater than 0 " +
    +          s"and less than ${Int.MaxValue / 1024}."))
    +        sc0.stop()
    +        conf.remove(config)
    +      }
    +    }
    +  }
    +
    +  test("verify spark.shuffle.spill.diskWriteBufferSize configuration") {
    +    val conf = new SparkConf(false)
    +
    +    // 2147483647: Int.MaxValue
    +    val maxvalue = 2147483647L
    +
    +    assert(!conf.contains(SHUFFLE_DISK_WRITE_BUFFER_SIZE))
    +
    +    // min values
    +    conf.set(SHUFFLE_DISK_WRITE_BUFFER_SIZE.key, "1024B")
    +    assert(conf.get(SHUFFLE_DISK_WRITE_BUFFER_SIZE) === 1024)
    +
    +    // max values
    +    conf.set(SHUFFLE_DISK_WRITE_BUFFER_SIZE.key, s"${maxvalue.toString}B")
    +    assert(conf.get(SHUFFLE_DISK_WRITE_BUFFER_SIZE) === maxvalue)
    +
    +    // Verify exception values
    +    Seq(
    +      "0B",
    +      "-1B",
    +      s"${(maxvalue + 1).toString}B"
    +    ).foreach { value =>
    +      conf.set(SHUFFLE_DISK_WRITE_BUFFER_SIZE.key, value)
    +      val sc0 = new SparkContext("local", "test", conf)
    +      val a = sc0.parallelize(1 to 5, 201)
    +      val b = a.map(x => (x, x*2))
    +      val c = new ShuffledRDD[Int, Int, Int](b, new HashPartitioner(201))
    +        .setSerializer(new KryoSerializer(conf))
    +
    +      val excMsg = intercept[SparkException] {
    +        c.collect()
    +      }.getMessage
    +      // Caused by: java.lang.IllegalArgumentException:
    +      // The buffer size must be greater than 0 and less than ${Int.MaxValue}.
    +      assert(excMsg.contains("The buffer size must be greater than 0 " +
    +        s"and less than ${Int.MaxValue}."))
    +      sc0.stop()
    +      conf.remove(SHUFFLE_DISK_WRITE_BUFFER_SIZE)
    +    }
    +  }
    --- End diff --
    
    Thanks for adding these test cases! Could you do me a favor: run each test case on a separate branch without any other changes, and post the current behaviors?


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79639 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79639/testReport)** for PR 18555 at commit [`3983744`](https://github.com/apache/spark/commit/3983744794359f36b20111d5d7e60d042c027426).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #80117 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80117/testReport)** for PR 18555 at commit [`92d86b2`](https://github.com/apache/spark/commit/92d86b2a216ff9eddb1dc31c7c89472e1e0298fb).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126315874
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -327,6 +421,8 @@ package object config {
           .doc("The blocks of a shuffle request will be fetched to disk when size of the request is " +
             "above this threshold. This is to avoid a giant request takes too much memory.")
           .bytesConf(ByteUnit.BYTE)
    +      .checkValue(_ > 0,
    +        "Size of the request is above this threshold event queue must not be negative")
    --- End diff --
    
    ditto. `positive`.
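
    For reference, a minimal standalone sketch of the kind of guard a `checkValue(_ > 0, ...)` adds, using the "must be positive" wording suggested here. This is not Spark's `ConfigBuilder` API; the object and method names below are illustrative only.

```scala
// Standalone sketch of a positive-value guard, mirroring what
// checkValue(_ > 0, "... must be positive") does inside ConfigBuilder.
// Names here (CheckValueSketch, checkPositive) are hypothetical, not Spark API.
object CheckValueSketch {
  def checkPositive(bytes: Long): Long = {
    // require throws IllegalArgumentException, the same exception type
    // the test cases in this PR intercept.
    require(bytes > 0, "The threshold for fetching shuffle blocks to disk must be positive.")
    bytes
  }

  def main(args: Array[String]): Unit = {
    println(checkPositive(1024L)) // a valid value passes through unchanged
    try {
      checkPositive(-1L)          // an invalid value fails fast
    } catch {
      case e: IllegalArgumentException => println("rejected: " + e.getMessage)
    }
  }
}
```

    A valid value is returned unchanged, while an invalid one is rejected before any service starts.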


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79460 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79460/testReport)** for PR 18555 at commit [`51f0cec`](https://github.com/apache/spark/commit/51f0cecebc950a5e61fb9a9290093982b4bbc1b1).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79443 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79443/testReport)** for PR 18555 at commit [`3df8feb`](https://github.com/apache/spark/commit/3df8feb21b091ac4c6d42f303bfce13d4468b190).
     * This patch **fails due to an unknown error code, -9**.
     * This patch merges cleanly.
     * This patch adds no public classes.


---


[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126293302
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -35,10 +35,21 @@ package object config {
         ConfigBuilder(SparkLauncher.DRIVER_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val DRIVER_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.driver.userClassPathFirst")
    +      .doc("Whether to give user-added jars precedence over Spark's own jars " +
    +        "when loading classes in the driver. This feature can be used to mitigate " +
    +        "conflicts between Spark's dependencies and user dependencies. " +
    +        "It is currently an experimental feature. This is used in cluster mode only. ")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val DRIVER_MEMORY = ConfigBuilder("spark.driver.memory")
    +    .doc("Amount of memory to use for the driver process, i.e. " +
    +      "where SparkContext is initialized. (e.g. 512, 1g, 2g). ")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    --- End diff --
    
    IIUC, should it be `Int.MaxValue / 1024` here?
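
    To make the unit arithmetic behind this question concrete (illustrative only): since `bytesConf(ByteUnit.MiB)` yields a value in MiB, the two candidate caps correspond to requiring the total to fit in an `Int` when expressed in bytes versus in KiB.

```scala
// The two candidate upper bounds for a memory setting expressed in MiB.
object MemoryCapArithmetic {
  def main(args: Array[String]): Unit = {
    // Cap in MiB if the total expressed in *bytes* must fit in an Int:
    println(Int.MaxValue / (1024 * 1024)) // 2047 MiB (~2 GiB)
    // Cap in MiB if the total expressed in *KiB* must fit in an Int:
    println(Int.MaxValue / 1024)          // 2097151 MiB (~2 TiB)
  }
}
```

    The two divisors differ by a factor of 1024, so which one is right depends on which quantity the code later needs to hold in an `Int`.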


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #87024 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87024/testReport)** for PR 18555 at commit [`c213be6`](https://github.com/apache/spark/commit/c213be69ae545cc360a9480969c1aa02d1e51e39).
     * This patch **fails due to an unknown error code, -9**.
     * This patch merges cleanly.
     * This patch adds no public classes.


---



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    retest this please


---



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #87048 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87048/testReport)** for PR 18555 at commit [`c213be6`](https://github.com/apache/spark/commit/c213be69ae545cc360a9480969c1aa02d1e51e39).


---



[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126315636
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +62,61 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 512, 1g, 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Long.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0 MiB " +
    +      s"and less than ${Long.MaxValue / (1024 * 1024)} MiB.")
    --- End diff --
    
    ditto.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test PASSed.


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r166164325
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -231,6 +315,9 @@ package object config {
       private[spark] val BLOCK_MANAGER_PORT = ConfigBuilder("spark.blockManager.port")
         .doc("Port to use for the block manager when a more specific setting is not provided.")
         .intConf
    +    .checkValue(v => v == 0 || (1024 <= v && v < 65536),
    --- End diff --
    
    @jerryshao , @HyukjinKwon 
    Since the same port check already exists in startServiceOnPort, do we still need to add it here?
    
      def startServiceOnPort[T](
          startPort: Int,
          startService: Int => (T, Int),
          conf: SparkConf,
          serviceName: String = ""): (T, Int) = {
    
        require(startPort == 0 || (1024 <= startPort && startPort < 65536),
          "startPort should be between 1024 and 65535 (inclusive), or 0 for a random free port.")
         
         ...
    }
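
    The port predicate shared by both checks can be exercised standalone. A sketch — `isValidPort` is an illustrative name, not a Spark API:

```scala
// The same predicate used by the proposed checkValue and by the existing
// require in Utils.startServiceOnPort: 0 requests a random free port,
// otherwise only the non-privileged range 1024..65535 is allowed.
object PortRangeSketch {
  def isValidPort(p: Int): Boolean = p == 0 || (1024 <= p && p < 65536)

  def main(args: Array[String]): Unit = {
    println(Seq(0, 1024, 65535).forall(isValidPort))  // true: boundary values accepted
    println(Seq(-1, 1000, 65536).exists(isValidPort)) // false: out-of-range values rejected
  }
}
```

    These are exactly the normal and abnormal values exercised by the `spark.blockManager.port` test above.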


---



[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r125938087
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -38,7 +38,12 @@ package object config {
         ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
     
       private[spark] val DRIVER_MEMORY = ConfigBuilder("spark.driver.memory")
    +    .doc("Amount of memory to use for the driver process, i.e. " +
    +      "where SparkContext is initialized. (e.g. 1g, 2g). ")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    +      s"The driver memory must be greater than 0MB " +
    --- End diff --
    
    I think we can remove 's'.
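
    The point being made: the `s` string interpolator is only needed when the literal contains a `${...}` substitution. A small illustration (string contents abbreviated from the diff):

```scala
// When the `s` prefix is redundant versus when it does real work.
object InterpolatorNote {
  def main(args: Array[String]): Unit = {
    // No ${...} inside, so a plain literal suffices; the `s` prefix is redundant:
    val plain = "The driver memory must be greater than 0MB "
    // Here the `s` prefix performs an actual substitution:
    val interpolated = s"and less than ${Int.MaxValue / (1024 * 1024)}MB."
    println(plain + interpolated)
  }
}
```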


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79926/
    Test FAILed.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79428 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79428/testReport)** for PR 18555 at commit [`9f924e5`](https://github.com/apache/spark/commit/9f924e53e83a3edd9413d4efc4a73b6053b0d6df).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79424 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79424/testReport)** for PR 18555 at commit [`ee9ded8`](https://github.com/apache/spark/commit/ee9ded81dbca519edfa0293381fc828307a96649).


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r166164886
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -231,6 +315,9 @@ package object config {
       private[spark] val BLOCK_MANAGER_PORT = ConfigBuilder("spark.blockManager.port")
         .doc("Port to use for the block manager when a more specific setting is not provided.")
         .intConf
    +    .checkValue(v => v == 0 || (1024 <= v && v < 65536),
    --- End diff --
    
    Ah, do you mean the check was already there and the current change just brings it ahead to make it fail fast? If so, that's fine.


---



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #86961 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/86961/testReport)** for PR 18555 at commit [`cac5c03`](https://github.com/apache/spark/commit/cac5c030919660bcfbf98841f7da159644d83f58).


---



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.


---



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/91602/
    Test FAILed.


---



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    @jiangxb1987 
    thanks.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test PASSed.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #80540 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80540/testReport)** for PR 18555 at commit [`3135235`](https://github.com/apache/spark/commit/31352352310eb36a9bc47d346781b02288795be1).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79926 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79926/testReport)** for PR 18555 at commit [`26afb05`](https://github.com/apache/spark/commit/26afb05a78447590144def42a54d9f88095c78a5).


---

[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/80540/
    Test PASSed.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    @heary-cao Sorry, I did not find the test cases. I am not comfortable merging the PR without them.


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126614860
  
    --- Diff: docs/configuration.md ---
    @@ -142,7 +142,7 @@ of the most common options to set are:
       <td>1g</td>
       <td>
         Amount of memory to use for the driver process, i.e. where SparkContext is initialized.
    -    (e.g. <code>1g</code>, <code>2g</code>).
    +    (e.g. <code>512</code>, <code>1g</code>, <code>2g</code>).
    --- End diff --
    
    I'd like to revert this change, to encourage users to always put a unit.
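    The ambiguity being avoided: with `bytesConf(ByteUnit.MiB)`, a bare number such as `512` is interpreted as MiB only by convention. A simplified Java sketch of suffix parsing (illustrative only, not Spark's actual `JavaUtils.byteStringAsMb`) shows why an explicit unit is clearer:

```java
import java.util.Locale;

public class MemoryString {
    // Simplified sketch: a bare number falls back to the default unit
    // (MiB here), which is why reviewers prefer an explicit suffix.
    static long toMib(String s) {
        s = s.trim().toLowerCase(Locale.ROOT);
        if (s.endsWith("g")) {
            return Long.parseLong(s.substring(0, s.length() - 1)) * 1024;
        } else if (s.endsWith("m")) {
            return Long.parseLong(s.substring(0, s.length() - 1));
        } else {
            return Long.parseLong(s); // ambiguous: treated as MiB by default
        }
    }

    public static void main(String[] args) {
        System.out.println(toMib("512")); // 512 -- only by convention
        System.out.println(toMib("1g"));  // 1024 -- explicit and unambiguous
    }
}
```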


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test PASSed.


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126315840
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -175,6 +244,8 @@ package object config {
         ConfigBuilder("spark.scheduler.listenerbus.metrics.maxListenerClassesTimed")
           .internal()
           .intConf
    +      .checkValue(_ > 0,
    +        "The maxListenerClassesTimed of listener bus event queue must not be negative")
    --- End diff --
    
    Since `_ > 0`, we should avoid 0, too.
    `must not be negative` => `must be positive`.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test PASSed.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/80117/
    Test PASSed.


---


[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Can one of the admins verify this patch?


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Seems fine.


---



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    BTW, most of the `> 0` and `>= 0` checks don't need tests, as they are so obvious.


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126326738
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +62,61 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 512, 1g, 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Long.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0 MiB " +
    +      s"and less than ${Long.MaxValue / (1024 * 1024)} MiB.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(_ > 0,
    +      "Number of cores to allocate for task event queue must not be negative")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    --- End diff --
    
    This may be a bit out of place, but in org.apache.spark.ExecutorAllocationManagerSuite.scala there is:
    val conf2 = conf.clone().set("spark.dynamicAllocation.minExecutors", "-1")
    Why is this value set to less than zero? I don't understand it yet.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79429/
    Test PASSed.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79927/
    Test FAILed.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Hmm.. why not address https://github.com/apache/spark/pull/18555#discussion_r126293557? I think that comment makes sense.


---



[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126273432
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +63,66 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .checkValues(Set(false, true))
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0M " +
    +      s"and less than ${Int.MaxValue / (1024 * 1024)}M.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .checkValues(Set(false, true))
    --- End diff --
    
    ditto
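    A quick arithmetic check (a standalone sketch, not part of the patch) shows why the Int.MaxValue bound in this revision was tight: it would cap executor memory at 2047 MiB, just under 2 GiB. Later revisions of the diff use Long.MaxValue instead, which lifts the cap:

```java
public class ExecutorMemoryBound {
    public static void main(String[] args) {
        // This revision bounded executor memory by Int.MaxValue / (1024 * 1024) MiB.
        int maxMibInt = Integer.MAX_VALUE / (1024 * 1024);
        System.out.println(maxMibInt);  // 2047 -- i.e. just under 2 GiB

        // The later Long.MaxValue bound, for comparison.
        long maxMibLong = Long.MAX_VALUE / (1024 * 1024);
        System.out.println(maxMibLong); // 8796093022207
    }
}
```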


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79431 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79431/testReport)** for PR 18555 at commit [`8257e8a`](https://github.com/apache/spark/commit/8257e8a23d113063417d565aaa7a4d2819de4596).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    cc @gatorsmile 


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126315617
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -35,10 +35,21 @@ package object config {
         ConfigBuilder(SparkLauncher.DRIVER_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val DRIVER_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.driver.userClassPathFirst")
    +      .doc("Whether to give user-added jars precedence over Spark's own jars " +
    +        "when loading classes in the driver. This feature can be used to mitigate " +
    +        "conflicts between Spark's dependencies and user dependencies. " +
    +        "It is currently an experimental feature. This is used in cluster mode only. ")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val DRIVER_MEMORY = ConfigBuilder("spark.driver.memory")
    +    .doc("Amount of memory to use for the driver process, i.e. " +
    +      "where SparkContext is initialized. (e.g. 512, 1g, 2g). ")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Long.MaxValue / (1024 * 1024),
    +      "The driver memory must be greater than 0 MiB " +
    +      s"and less than ${Long.MaxValue / (1024 * 1024)} MiB.")
    --- End diff --
    
    nit: the maximum value for `spark.driver.memory` on a single machine looks unrealistic, 8191 PB?
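    Checking the reviewer's figure: with the value measured in MiB, the bound works out to just under 8192 PiB (about 8 EiB). A minimal Java arithmetic sketch:

```java
public class DriverMemoryBound {
    public static void main(String[] args) {
        // The diff caps spark.driver.memory at Long.MaxValue / (1024 * 1024) MiB.
        long maxMib = Long.MAX_VALUE / (1024 * 1024);
        // Convert MiB -> PiB (divide by 1024^3) to see the scale.
        long maxPib = maxMib / (1024L * 1024 * 1024);
        System.out.println(maxMib); // 8796093022207
        System.out.println(maxPib); // 8191
    }
}
```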
    



---


[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126274183
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +63,66 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .checkValues(Set(false, true))
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0M " +
    +      s"and less than ${Int.MaxValue / (1024 * 1024)}M.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .checkValues(Set(false, true))
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(v => v > 0,
    +      "Number of cores to allocate for task event queue must not be negative")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    +      .checkValue(v => v >= 0,
    +        "Lower bound for the number of executors should be greater than or equal to 0")
    +      .createWithDefault(0)
     
       private[spark] val DYN_ALLOCATION_INITIAL_EXECUTORS =
         ConfigBuilder("spark.dynamicAllocation.initialExecutors")
    +      .doc("Initial number of executors to run if dynamic allocation is enabled. " +
    +        "If `--num-executors` (or `spark.executor.instances`) is set " +
    +        "and larger than this value, " +
    +        "it will be used as the initial number of executors.")
           .fallbackConf(DYN_ALLOCATION_MIN_EXECUTORS)
    --- End diff --
    
    check if >= 0?


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79424 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79424/testReport)** for PR 18555 at commit [`ee9ded8`](https://github.com/apache/spark/commit/ee9ded81dbca519edfa0293381fc828307a96649).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126274201
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -88,50 +137,76 @@ package object config {
         .createOptional
     
       private[spark] val PY_FILES = ConfigBuilder("spark.submit.pyFiles")
    +    .doc("Comma-separated list of .zip, .egg, or .py files to place on " +
    +      "the PYTHONPATH for Python apps.")
         .internal()
         .stringConf
         .toSequence
         .createWithDefault(Nil)
     
       private[spark] val MAX_TASK_FAILURES =
         ConfigBuilder("spark.task.maxFailures")
    +      .doc("Number of failures of any particular task before giving up on the job. " +
    +        "The total number of failures spread across different tasks " +
    +        "will not cause the job to fail; " +
    +        "a particular task has to fail this number of attempts. " +
    +        "Should be greater than or equal to 1. Number of allowed retries = this value - 1.")
           .intConf
    +      .checkValue(_ > 0, "The retry times of task should be greater than or equal to 1")
           .createWithDefault(4)
     
       // Blacklist confs
       private[spark] val BLACKLIST_ENABLED =
         ConfigBuilder("spark.blacklist.enabled")
    +      .doc("If set to 'true', prevent Spark from scheduling tasks on executors " +
    +        "that have been blacklisted due to too many task failures. " +
    +        "The blacklisting algorithm can be further controlled by " +
    +        "the other 'spark.blacklist' configuration options. ")
           .booleanConf
           .createOptional
     
       private[spark] val MAX_TASK_ATTEMPTS_PER_EXECUTOR =
         ConfigBuilder("spark.blacklist.task.maxTaskAttemptsPerExecutor")
    +      .doc("For a given task, how many times it can be " +
    +        "retried on one executor before the executor is blacklisted for that task.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(1)
     
       private[spark] val MAX_TASK_ATTEMPTS_PER_NODE =
         ConfigBuilder("spark.blacklist.task.maxTaskAttemptsPerNode")
    +      .doc("For a given task, how many times it can be " +
    +        "retried on one node, before the entire node is blacklisted for that task.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(2)
     
       private[spark] val MAX_FAILURES_PER_EXEC =
         ConfigBuilder("spark.blacklist.application.maxFailedTasksPerExecutor")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
    --- End diff --
    
    ditto




[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79930 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79930/testReport)** for PR 18555 at commit [`4c55f61`](https://github.com/apache/spark/commit/4c55f6112e87d1305994ce13bdc3216360f32ecb).




[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126315844
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -183,32 +254,42 @@ package object config {
         .createOptional
     
       private[spark] val PYSPARK_DRIVER_PYTHON = ConfigBuilder("spark.pyspark.driver.python")
    +    .doc("Python binary executable to use for PySpark in driver.")
         .stringConf
         .createOptional
     
       private[spark] val PYSPARK_PYTHON = ConfigBuilder("spark.pyspark.python")
    +    .doc("Python binary executable to use for PySpark in both driver and executors.")
         .stringConf
         .createOptional
     
       // To limit memory usage, we only track information for a fixed number of tasks
       private[spark] val UI_RETAINED_TASKS = ConfigBuilder("spark.ui.retainedTasks")
    +    .doc("How many tasks the Spark UI and status APIs remember before garbage collecting.")
         .intConf
    +    .checkValue(_ > 0, "Number of Tasks event queue must not be negative")
    --- End diff --
    
    Since `_ > 0`, we should avoid 0, too.
    `must not be negative` => `must be positive`.
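The distinction can be seen with a minimal stand-in for `checkValue`; this is a sketch only, the real builder in `org.apache.spark.internal.config` is not reproduced here:

```scala
// Hypothetical stand-in for ConfigBuilder.checkValue, for illustration only.
def checkValue(value: Int, validator: Int => Boolean, errorMsg: String): Int = {
  if (!validator(value)) throw new IllegalArgumentException(errorMsg)
  value
}

val mustBePositive: Int => Boolean = _ > 0

// 0 is not negative, yet `_ > 0` rejects it, so the message should read
// "must be positive" rather than "must not be negative".
assert(mustBePositive(1))
assert(!mustBePositive(0))
assert(checkValue(5, mustBePositive, "must be positive") == 5)
```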




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79635 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79635/testReport)** for PR 18555 at commit [`5b2c9cc`](https://github.com/apache/spark/commit/5b2c9cc9464b00a0e9925b7bc06e16389bbcfc7c).
     * This patch **fails Scala style tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Thanks! @heary-cao 
    
    cc @jiangxb1987 Could you take a look to ensure no behavior change will be caused by this PR?




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    ok to test




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.






[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79460/
    Test PASSed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79443/
    Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Seems https://github.com/apache/spark/pull/18555#discussion_r126293557 is missed.




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r125938115
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -54,7 +59,11 @@ package object config {
         ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    +      s"The executor memory must be greater than 0M " +
    --- End diff --
    
    ditto.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    retest this please




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    @gatorsmile @cloud-fan 
    I added new test cases again, except for the following:
    ```
          DYN_ALLOCATION_MIN_EXECUTORS
          DYN_ALLOCATION_INITIAL_EXECUTORS
          DYN_ALLOCATION_MAX_EXECUTORS
          MAX_TASK_ATTEMPTS_PER_EXECUTOR,
          MAX_TASK_ATTEMPTS_PER_NODE,
          MAX_FAILURES_PER_EXEC_STAGE,
          MAX_FAILED_EXEC_PER_NODE_STAGE,
          MAX_FAILURES_PER_EXEC,
          MAX_FAILED_EXEC_PER_NODE,
          BLACKLIST_TIMEOUT_CONF
    ```
    because test cases for these already exist in other test suites.
    Could you review it again?
    Thanks.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79431 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79431/testReport)** for PR 18555 at commit [`8257e8a`](https://github.com/apache/spark/commit/8257e8a23d113063417d565aaa7a4d2819de4596).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79602 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79602/testReport)** for PR 18555 at commit [`fcb9329`](https://github.com/apache/spark/commit/fcb9329220b51a43dddf8ac6a550295c8891d23b).
     * This patch **fails due to an unknown error code, -9**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79650/
    Test PASSed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    @gatorsmile 
    I added a supplementary test case for `v => v == 0 || (1024 <= v && v < 65536)`.
    Please review it again.
    Thank you.
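The port predicate above can be exercised directly. This is an illustrative sketch, not Spark's actual `spark.blockManager.port` validation code:

```scala
// Sketch of the port predicate under review: 0 means "let the OS pick an
// ephemeral port"; otherwise the value must be an unprivileged port below
// 65536. Illustrative only.
val validPort: Int => Boolean = v => v == 0 || (1024 <= v && v < 65536)

assert(validPort(0))      // 0 delegates port selection to the OS
assert(validPort(1024))   // lowest allowed explicit port
assert(!validPort(80))    // privileged ports are rejected
assert(!validPort(65536)) // outside the 16-bit port range
```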




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79637 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79637/testReport)** for PR 18555 at commit [`17afdd7`](https://github.com/apache/spark/commit/17afdd7a2f5fa4333bd243cad78265f28eb1f62d).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79431/
    Test PASSed.




[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao closed the pull request at:

    https://github.com/apache/spark/pull/18555




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Can one of the admins verify this patch?




[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by kiszk <gi...@git.apache.org>.
Github user kiszk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126660503
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +62,65 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 512, 1g, 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / 1024,
    +      "The executor memory must be greater than 0 MiB " +
    +      s"and less than ${Int.MaxValue / 1024} MiB.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(_ > 0,
    +      "Number of cores to allocate for task event queue must be positive.")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    +      .checkValue(v => v >= 0,
    +        "Lower bound for the number of executors should be greater than or equal to 0.")
    +      .createWithDefault(0)
     
       private[spark] val DYN_ALLOCATION_INITIAL_EXECUTORS =
         ConfigBuilder("spark.dynamicAllocation.initialExecutors")
    +      .doc("Initial number of executors to run if dynamic allocation is enabled. " +
    +        "If `--num-executors` (or `spark.executor.instances`) is set " +
    +        "and larger than this value, " +
    +        "it will be used as the initial number of executors.")
           .fallbackConf(DYN_ALLOCATION_MIN_EXECUTORS)
     
       private[spark] val DYN_ALLOCATION_MAX_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.maxExecutors").intConf.createWithDefault(Int.MaxValue)
    +    ConfigBuilder("spark.dynamicAllocation.maxExecutors")
    +      .doc("Upper bound for the number of executors if dynamic allocation is enabled. ")
    --- End diff --
    
    nit: remove space after `.` if no additional string.
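The builder pattern shown in the diff can be sketched with a simplified `ConfigEntry`. This is a hedged, minimal model; the real `ConfigBuilder` in `org.apache.spark.internal.config` also handles units, fallbacks, and string parsing, none of which is modeled here:

```scala
// Simplified model of a documented, validated config entry. The names
// ConfigEntry and readFrom are illustrative, not Spark's internal API.
final case class ConfigEntry[T](
    key: String,
    doc: String,
    default: T,
    validator: T => Boolean,
    errorMsg: String) {
  // Read the value from a settings map, enforcing the validator.
  def readFrom(settings: Map[String, T]): T = {
    val v = settings.getOrElse(key, default)
    require(validator(v), s"$key: $errorMsg")
    v
  }
}

val minExecutors = ConfigEntry[Int](
  key = "spark.dynamicAllocation.minExecutors",
  doc = "Lower bound for the number of executors if dynamic allocation is enabled.",
  default = 0,
  validator = _ >= 0,
  errorMsg = "Lower bound for the number of executors should be >= 0.")

assert(minExecutors.readFrom(Map.empty) == 0)                  // default applies
assert(minExecutors.readFrom(Map(minExecutors.key -> 4)) == 4) // valid override
val rejected = scala.util.Try(minExecutors.readFrom(Map(minExecutors.key -> -1)))
assert(rejected.isFailure)                                     // validator fires
```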




[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126308199
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +62,63 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 512, 1g, 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0M " +
    +      s"and less than ${Int.MaxValue / (1024 * 1024)}M.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(v => v > 0,
    +      "Number of cores to allocate for task event queue must not be negative")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    +      .checkValue(v => v >= 0,
    --- End diff --
    
    Thank you for answering it.
    I have removed the checkValue.
    Thank you.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79549/
    Test PASSed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    @gatorsmile 
    Thank you for reviewing it again. I quite agree with your suggestion.
    Later I will add test cases for each `checkValue`. I love working on Spark. Thank you again.





[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test PASSed.



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126273429
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +63,66 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .checkValues(Set(false, true))
    --- End diff --
    
    ditto




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126274198
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -88,50 +137,76 @@ package object config {
         .createOptional
     
       private[spark] val PY_FILES = ConfigBuilder("spark.submit.pyFiles")
    +    .doc("Comma-separated list of .zip, .egg, or .py files to place on " +
    +      "the PYTHONPATH for Python apps.")
         .internal()
         .stringConf
         .toSequence
         .createWithDefault(Nil)
     
       private[spark] val MAX_TASK_FAILURES =
         ConfigBuilder("spark.task.maxFailures")
    +      .doc("Number of failures of any particular task before giving up on the job. " +
    +        "The total number of failures spread across different tasks " +
    +        "will not cause the job to fail; " +
    +        "a particular task has to fail this number of attempts. " +
    +        "Should be greater than or equal to 1. Number of allowed retries = this value - 1.")
           .intConf
    +      .checkValue(_ > 0, "The retry times of task should be greater than or equal to 1")
           .createWithDefault(4)
     
       // Blacklist confs
       private[spark] val BLACKLIST_ENABLED =
         ConfigBuilder("spark.blacklist.enabled")
    +      .doc("If set to 'true', prevent Spark from scheduling tasks on executors " +
    +        "that have been blacklisted due to too many task failures. " +
    +        "The blacklisting algorithm can be further controlled by " +
    +        "the other 'spark.blacklist' configuration options. ")
           .booleanConf
           .createOptional
     
       private[spark] val MAX_TASK_ATTEMPTS_PER_EXECUTOR =
         ConfigBuilder("spark.blacklist.task.maxTaskAttemptsPerExecutor")
    +      .doc("For a given task, how many times it can be " +
    +        "retried on one executor before the executor is blacklisted for that task.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
    --- End diff --
    
    `should be greater than 0`




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79932/
    Test FAILed.



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79933 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79933/testReport)** for PR 18555 at commit [`0970a89`](https://github.com/apache/spark/commit/0970a891d02865ef55d2d007a38bbe7ebae66877).



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79933 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79933/testReport)** for PR 18555 at commit [`0970a89`](https://github.com/apache/spark/commit/0970a891d02865ef55d2d007a38bbe7ebae66877).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #80540 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/80540/testReport)** for PR 18555 at commit [`3135235`](https://github.com/apache/spark/commit/31352352310eb36a9bc47d346781b02288795be1).




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126274204
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -88,50 +137,76 @@ package object config {
         .createOptional
     
       private[spark] val PY_FILES = ConfigBuilder("spark.submit.pyFiles")
    +    .doc("Comma-separated list of .zip, .egg, or .py files to place on " +
    +      "the PYTHONPATH for Python apps.")
         .internal()
         .stringConf
         .toSequence
         .createWithDefault(Nil)
     
       private[spark] val MAX_TASK_FAILURES =
         ConfigBuilder("spark.task.maxFailures")
    +      .doc("Number of failures of any particular task before giving up on the job. " +
    +        "The total number of failures spread across different tasks " +
    +        "will not cause the job to fail; " +
    +        "a particular task has to fail this number of attempts. " +
    +        "Should be greater than or equal to 1. Number of allowed retries = this value - 1.")
           .intConf
    +      .checkValue(_ > 0, "The retry times of task should be greater than or equal to 1")
           .createWithDefault(4)
     
       // Blacklist confs
       private[spark] val BLACKLIST_ENABLED =
         ConfigBuilder("spark.blacklist.enabled")
    +      .doc("If set to 'true', prevent Spark from scheduling tasks on executors " +
    +        "that have been blacklisted due to too many task failures. " +
    +        "The blacklisting algorithm can be further controlled by " +
    +        "the other 'spark.blacklist' configuration options. ")
           .booleanConf
           .createOptional
     
       private[spark] val MAX_TASK_ATTEMPTS_PER_EXECUTOR =
         ConfigBuilder("spark.blacklist.task.maxTaskAttemptsPerExecutor")
    +      .doc("For a given task, how many times it can be " +
    +        "retried on one executor before the executor is blacklisted for that task.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(1)
     
       private[spark] val MAX_TASK_ATTEMPTS_PER_NODE =
         ConfigBuilder("spark.blacklist.task.maxTaskAttemptsPerNode")
    +      .doc("For a given task, how many times it can be " +
    +        "retried on one node, before the entire node is blacklisted for that task.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(2)
     
       private[spark] val MAX_FAILURES_PER_EXEC =
         ConfigBuilder("spark.blacklist.application.maxFailedTasksPerExecutor")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(2)
     
       private[spark] val MAX_FAILURES_PER_EXEC_STAGE =
         ConfigBuilder("spark.blacklist.stage.maxFailedTasksPerExecutor")
    +      .doc("How many different tasks must fail on one executor, " +
    +        "within one stage, before the executor is blacklisted for that stage.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(2)
     
       private[spark] val MAX_FAILED_EXEC_PER_NODE =
         ConfigBuilder("spark.blacklist.application.maxFailedExecutorsPerNode")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
    --- End diff --
    
    ditto




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79424/
    Test FAILed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    @gatorsmile 
    Thank you.
    I have updated it.
    Please review it again.
    Thanks.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79549 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79549/testReport)** for PR 18555 at commit [`dd066b6`](https://github.com/apache/spark/commit/dd066b60bcb4257d3825a47840b68b9b55f4131a).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79930/
    Test FAILed.



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    retest this please




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    retest this please




[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126274206
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -88,50 +137,76 @@ package object config {
         .createOptional
     
       private[spark] val PY_FILES = ConfigBuilder("spark.submit.pyFiles")
    +    .doc("Comma-separated list of .zip, .egg, or .py files to place on " +
    +      "the PYTHONPATH for Python apps.")
         .internal()
         .stringConf
         .toSequence
         .createWithDefault(Nil)
     
       private[spark] val MAX_TASK_FAILURES =
         ConfigBuilder("spark.task.maxFailures")
    +      .doc("Number of failures of any particular task before giving up on the job. " +
    +        "The total number of failures spread across different tasks " +
    +        "will not cause the job to fail; " +
    +        "a particular task has to fail this number of attempts. " +
    +        "Should be greater than or equal to 1. Number of allowed retries = this value - 1.")
           .intConf
    +      .checkValue(_ > 0, "The retry times of task should be greater than or equal to 1")
           .createWithDefault(4)
     
       // Blacklist confs
       private[spark] val BLACKLIST_ENABLED =
         ConfigBuilder("spark.blacklist.enabled")
    +      .doc("If set to 'true', prevent Spark from scheduling tasks on executors " +
    +        "that have been blacklisted due to too many task failures. " +
    +        "The blacklisting algorithm can be further controlled by " +
    +        "the other 'spark.blacklist' configuration options. ")
           .booleanConf
           .createOptional
     
       private[spark] val MAX_TASK_ATTEMPTS_PER_EXECUTOR =
         ConfigBuilder("spark.blacklist.task.maxTaskAttemptsPerExecutor")
    +      .doc("For a given task, how many times it can be " +
    +        "retried on one executor before the executor is blacklisted for that task.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(1)
     
       private[spark] val MAX_TASK_ATTEMPTS_PER_NODE =
         ConfigBuilder("spark.blacklist.task.maxTaskAttemptsPerNode")
    +      .doc("For a given task, how many times it can be " +
    +        "retried on one node, before the entire node is blacklisted for that task.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(2)
     
       private[spark] val MAX_FAILURES_PER_EXEC =
         ConfigBuilder("spark.blacklist.application.maxFailedTasksPerExecutor")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(2)
     
       private[spark] val MAX_FAILURES_PER_EXEC_STAGE =
         ConfigBuilder("spark.blacklist.stage.maxFailedTasksPerExecutor")
    +      .doc("How many different tasks must fail on one executor, " +
    +        "within one stage, before the executor is blacklisted for that stage.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(2)
     
       private[spark] val MAX_FAILED_EXEC_PER_NODE =
         ConfigBuilder("spark.blacklist.application.maxFailedExecutorsPerNode")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
           .createWithDefault(2)
     
       private[spark] val MAX_FAILED_EXEC_PER_NODE_STAGE =
         ConfigBuilder("spark.blacklist.stage.maxFailedExecutorsPerNode")
    +      .doc("How many different executors are marked as blacklisted for a given stage, " +
    +        "before the entire node is marked as failed for the stage.")
           .intConf
    +      .checkValue(_ > 0, "The value should be greater than or equal to 1")
    --- End diff --
    
    ditto




[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126308228
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -35,10 +35,21 @@ package object config {
         ConfigBuilder(SparkLauncher.DRIVER_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val DRIVER_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.driver.userClassPathFirst")
    +      .doc("Whether to give user-added jars precedence over Spark's own jars " +
    +        "when loading classes in the driver. This feature can be used to mitigate " +
    +        "conflicts between Spark's dependencies and user dependencies. " +
    +        "It is currently an experimental feature. This is used in cluster mode only. ")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val DRIVER_MEMORY = ConfigBuilder("spark.driver.memory")
    +    .doc("Amount of memory to use for the driver process, i.e. " +
    +      "where SparkContext is initialized. (e.g. 512, 1g, 2g). ")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    --- End diff --
    
    Sorry, I changed Int to Long.
    I think MiB corresponds to 1024 * 1024 bytes.
    Thanks.
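
The overflow concern behind this exchange can be sketched in plain Scala (names here are illustrative, not Spark's): the bound for a memory setting expressed in MiB should be derived from `Long.MaxValue`, not `Int.MaxValue`, and the literal `1024 * 1024` is safe Int arithmetic (1048576 fits in an Int) that gets promoted to Long in the division:

```scala
// Sketch of the MiB bound discussed above; not Spark's actual code.
object MiBBoundSketch {
  val bytesPerMiB: Long = 1024L * 1024L // 1048576

  // Largest memory setting, in MiB, that still fits in a Long byte count.
  val maxMiB: Long = Long.MaxValue / bytesPerMiB // 8796093022207

  def isValidMemoryMiB(v: Long): Boolean = v > 0 && v <= maxMiB

  def main(args: Array[String]): Unit = {
    println(maxMiB)
    println(isValidMemoryMiB(1024L)) // 1 GiB expressed in MiB: valid
    println(isValidMemoryMiB(0L))    // must be positive: invalid
  }
}
```

Had the bound been written as `Int.MaxValue / (1024 * 1024)`, it would cap the setting at 2047 MiB, which is far too small for a driver or executor memory limit.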




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79930 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79930/testReport)** for PR 18555 at commit [`4c55f61`](https://github.com/apache/spark/commit/4c55f6112e87d1305994ce13bdc3216360f32ecb).
     * This patch **fails to build**.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #84786 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/84786/testReport)** for PR 18555 at commit [`3135235`](https://github.com/apache/spark/commit/31352352310eb36a9bc47d346781b02288795be1).




[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/87053/
    Test PASSed.




[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126343554
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +62,61 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 512, 1g, 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Long.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0 MiB " +
    +      s"and less than ${Long.MaxValue / (1024 * 1024)} MiB.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(_ > 0,
    +      "Number of cores to allocate for task event queue must not be negative")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    --- End diff --
    
    thank you.
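    [Editor's note] The MiB upper bound quoted in the diff above (`v > 0 && v <= Long.MaxValue / (1024 * 1024)`) is easy to check in isolation. The sketch below is a hypothetical standalone restatement of that predicate, not Spark code; `validMiB` and `MAX_MIB` are names invented for illustration.

    ```java
    public class MemoryCheck {
        // Mirrors the checkValue predicate from the diff: a memory size in MiB
        // must be positive and small enough that converting it to bytes
        // (MiB * 1024 * 1024) cannot overflow a signed 64-bit long.
        static final long MAX_MIB = Long.MAX_VALUE / (1024 * 1024);

        static boolean validMiB(long v) {
            return v > 0 && v <= MAX_MIB;
        }

        public static void main(String[] args) {
            System.out.println(validMiB(1024));        // a 1 GiB setting is in range
            System.out.println(validMiB(0));           // zero is rejected
            System.out.println(validMiB(MAX_MIB + 1)); // would overflow when scaled to bytes
        }
    }
    ```

    The point of bounding by `Long.MaxValue / (1024 * 1024)` rather than `Long.MaxValue` is that the config is parsed in MiB but later multiplied up to bytes, so the check must leave headroom for that multiplication.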


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Checking whether the values are set or not is not enough. We need to check whether these parameters are effective or not; that is, we need to verify the actual behavior of Spark.


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126315864
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -306,19 +396,23 @@ package object config {
             "record the size accurately if it's above this config. This helps to prevent OOM by " +
             "avoiding underestimating shuffle block size when fetch shuffle blocks.")
           .bytesConf(ByteUnit.BYTE)
    +      .checkValue(_ > 0, "The number of BlockThreshold event queue must not be negative")
    --- End diff --
    
    ditto. `positive`.


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126315856
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -259,6 +344,7 @@ package object config {
           " over estimate, then the partitions with small files will be faster than partitions with" +
           " bigger files.")
         .longConf
    +    .checkValue(_ > 0, "The number of CostInBytes event queue must not be negative")
    --- End diff --
    
    Since `_ > 0`, we should avoid 0, too.
    `must not be negative` => `must be positive`.


---


[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by dongjoon-hyun <gi...@git.apache.org>.
Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126315870
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -306,19 +396,23 @@ package object config {
             "record the size accurately if it's above this config. This helps to prevent OOM by " +
             "avoiding underestimating shuffle block size when fetch shuffle blocks.")
           .bytesConf(ByteUnit.BYTE)
    +      .checkValue(_ > 0, "The number of BlockThreshold event queue must not be negative")
           .createWithDefault(100 * 1024 * 1024)
     
       private[spark] val SHUFFLE_REGISTRATION_TIMEOUT =
         ConfigBuilder("spark.shuffle.registration.timeout")
           .doc("Timeout in milliseconds for registration to the external shuffle service.")
           .timeConf(TimeUnit.MILLISECONDS)
    +      .checkValue(_ > 0,
    +        "Timeout in milliseconds for registration event queue must not be negative")
           .createWithDefault(5000)
     
       private[spark] val SHUFFLE_REGISTRATION_MAX_ATTEMPTS =
         ConfigBuilder("spark.shuffle.registration.maxAttempts")
           .doc("When we fail to register to the external shuffle service, we will " +
             "retry for maxAttempts times.")
           .intConf
    +      .checkValue(_ > 0, "Retry for maxAttempts times event queue must not be negative")
    --- End diff --
    
    ditto. `positive`.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/87024/
    Test FAILed.


---



[GitHub] spark pull request #18555: [SPARK-21353][CORE]add checkValue in spark.intern...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r165838720
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -231,6 +315,9 @@ package object config {
       private[spark] val BLOCK_MANAGER_PORT = ConfigBuilder("spark.blockManager.port")
         .doc("Port to use for the block manager when a more specific setting is not provided.")
         .intConf
    +    .checkValue(v => v == 0 || (1024 <= v && v < 65536),
    --- End diff --
    
    +1 on ^
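    [Editor's note] The port rule under discussion (accept 0 for a random free port, otherwise require a value in [1024, 65535]) can be restated as a standalone predicate. This is an illustrative sketch only; `validPort` is a made-up helper, not part of Spark.

    ```java
    public class PortCheck {
        // Same rule as the quoted checkValue: 0 means "pick a random free port";
        // otherwise the port must be a non-privileged value in [1024, 65535].
        static boolean validPort(int v) {
            return v == 0 || (1024 <= v && v < 65536);
        }

        public static void main(String[] args) {
            System.out.println(validPort(0));     // random free port: accepted
            System.out.println(validPort(7077));  // ordinary port: accepted
            System.out.println(validPort(80));    // privileged port: rejected
            System.out.println(validPort(65536)); // out of range: rejected
        }
    }
    ```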


---



[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    @gatorsmile 
    Thank you very much for reviewing the code again,
    and for patiently walking me through Spark step by step.
    Here are the results of running each test case on the master branch without any extra changes:
    
    ```
    test("verify spark.blockManager.port configuration")
    current behaviors:
    Caused by: java.lang.IllegalArgumentException: 
    requirement failed: startPort should be between 1024 and 65535 (inclusive),
    or 0 for a random free port.
    
    test("verify spark.executor.memory configuration exception")
    current behaviors:
    Caused by: java.lang.NumberFormatException: Size must be specified as bytes (b), 
    kibibytes (k), mebibytes (m), gibibytes (g), tebibytes (t), or pebibytes(p). 
    E.g. 50b, 100k, or 250m.
    
    test("verify listenerbus.eventqueue.capacity configuration exception")
    current behaviors:
    Caused by: java.lang.IllegalArgumentException: 
    The capacity of listener bus event queue must not be negative
    
    test("verify shuffle file buffer size configuration")
    current behaviors:
    Caused by: java.lang.IllegalArgumentException: Buffer size <= 0
    
    test("verify spark.shuffle.spill.diskWriteBufferSize configuration")
    current behaviors:
    while (dataRemaining > 0) Dead loop or Caused by: java.lang.NegativeArraySizeException
    ```
    
    ```
    test("verify spark.shuffle.registration.timeout configuration exception")
    current behaviors: no exception was thrown.
    **Analysis**
    However, Caused by: java.util.concurrent.TimeoutException:
    Timeout waiting for task. is reported in registerWithShuffleServer
    
    test("verify spark.task.cpus configuration exception")
    current behaviors: no exception was thrown.
    **Analysis**
    when spark.task.cpus = -1,
    executorData.freeCores -= scheduler.CPUS_PER_TASK
    leaves the executor with more and more idle cores
    
    test("verify spark.reducer.maxReqSizeShuffleToMem configuration exception")
    current behaviors: no exception was thrown.
    **Analysis**
    // Fetch remote shuffle blocks to disk when the request is too large. Since the shuffle data is
    // already encrypted and compressed over the wire (w.r.t. the related configs), we can just fetch
    // the data and write it to file directly.
    ```
    
    > Suggestion: I will remove these checkValue calls and their unit tests.
    
    ```
    test("verify spark.shuffle.registration.maxAttempts configuration exception")
    current behaviors: no exception was thrown.
    
    test("verify spark.task.maxFailures configuration exception")
    current behaviors: no exception was thrown.
    
    test("verify metrics.maxListenerClassesTimed configuration exception")
    current behaviors: no exception was thrown.
    
    test("verify spark.ui.retained configuration exception")
    current behaviors: no exception was thrown.
    
    test("verify spark.files.maxPartitionBytes configuration exception")
    current behaviors: no exception was thrown.
    
    test("verify spark.files.openCostInBytes configuration exception")
    current behaviors: no exception was thrown.
    
    test("verify spark.shuffle.accurateBlockThreshold configuration exception")
    current behaviors: no exception was thrown.
    ```
    do you agree with my suggestion? 
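    [Editor's note] The trade-off this message weighs (a targeted error at configuration time versus a confusing failure deep in the runtime, such as the dead loop or NegativeArraySizeException listed above) is the whole point of `checkValue`. The toy builder below is a hypothetical miniature of that idea, not Spark's `ConfigBuilder` API; all names in it are invented for illustration.

    ```java
    import java.util.function.LongPredicate;

    public class ToyConfig {
        // A miniature of the checkValue idea: the predicate and message are
        // attached to the config entry, so a bad value fails at set time with
        // a targeted error instead of surfacing later inside the runtime.
        final String key;
        final LongPredicate check;
        final String message;

        ToyConfig(String key, LongPredicate check, String message) {
            this.key = key;
            this.check = check;
            this.message = message;
        }

        long set(long value) {
            if (!check.test(value)) {
                throw new IllegalArgumentException(key + ": " + message);
            }
            return value;
        }

        public static void main(String[] args) {
            ToyConfig buffer = new ToyConfig(
                "spark.shuffle.spill.diskWriteBufferSize", v -> v > 0,
                "The buffer size must be positive");
            System.out.println(buffer.set(4096)); // valid value passes through
            try {
                buffer.set(-1);                   // invalid value fails immediately
            } catch (IllegalArgumentException e) {
                System.out.println(e.getMessage());
            }
        }
    }
    ```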


---


[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/79361/
    Test FAILed.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #91602 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/91602/testReport)** for PR 18555 at commit [`a0efb41`](https://github.com/apache/spark/commit/a0efb41fe041520bf423933b3c7b18ad2feb6711).


---



[GitHub] spark issue #18555: [Minor]add checkValue in spark.internal.config about how...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    ok to test


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79927 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79927/testReport)** for PR 18555 at commit [`364cfc4`](https://github.com/apache/spark/commit/364cfc4d6c8efc2a260d2a7cd578d9099851e3d2).
     * This patch **fails to build**.
     * This patch merges cleanly.
     * This patch adds no public classes.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79639 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79639/testReport)** for PR 18555 at commit [`3983744`](https://github.com/apache/spark/commit/3983744794359f36b20111d5d7e60d042c027426).


---


[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126293375
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +62,63 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 512, 1g, 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    --- End diff --
    
    Also here.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79609 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79609/testReport)** for PR 18555 at commit [`aa6c621`](https://github.com/apache/spark/commit/aa6c62122ba3091695d19c8915361e3b8a24f363).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---


[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by jerryshao <gi...@git.apache.org>.
Github user jerryshao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126293500
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -175,6 +246,8 @@ package object config {
         ConfigBuilder("spark.scheduler.listenerbus.metrics.maxListenerClassesTimed")
           .internal()
           .intConf
    +      .checkValue(v => v > 0,
    --- End diff --
    
    Please unify this style to one for all the checks if possible, like `_ > 0`.


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    **[Test build #79443 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/79443/testReport)** for PR 18555 at commit [`3df8feb`](https://github.com/apache/spark/commit/3df8feb21b091ac4c6d42f303bfce13d4468b190).


---


[GitHub] spark pull request #18555: [Minor]add checkValue in spark.internal.config ab...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18555#discussion_r126274185
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -51,29 +63,66 @@ package object config {
         ConfigBuilder(SparkLauncher.EXECUTOR_EXTRA_LIBRARY_PATH).stringConf.createOptional
     
       private[spark] val EXECUTOR_USER_CLASS_PATH_FIRST =
    -    ConfigBuilder("spark.executor.userClassPathFirst").booleanConf.createWithDefault(false)
    +    ConfigBuilder("spark.executor.userClassPathFirst")
    +      .doc("Same functionality as spark.driver.userClassPathFirst, " +
    +        "but applied to executor instances.")
    +      .booleanConf
    +      .checkValues(Set(false, true))
    +      .createWithDefault(false)
     
       private[spark] val EXECUTOR_MEMORY = ConfigBuilder("spark.executor.memory")
    +    .doc("Amount of memory to use per executor process (e.g. 2g, 8g).")
         .bytesConf(ByteUnit.MiB)
    +    .checkValue(v => v > 0 && v <= Int.MaxValue / (1024 * 1024),
    +      "The executor memory must be greater than 0M " +
    +      s"and less than ${Int.MaxValue / (1024 * 1024)}M.")
         .createWithDefaultString("1g")
     
    -  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython").internal()
    -    .booleanConf.createWithDefault(false)
    +  private[spark] val IS_PYTHON_APP = ConfigBuilder("spark.yarn.isPython")
    +    .internal()
    +    .booleanConf
    +    .checkValues(Set(false, true))
    +    .createWithDefault(false)
     
    -  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus").intConf.createWithDefault(1)
    +  private[spark] val CPUS_PER_TASK = ConfigBuilder("spark.task.cpus")
    +    .doc("Number of cores to allocate for each task. ")
    +    .intConf
    +    .checkValue(v => v > 0,
    +      "Number of cores to allocate for task event queue must not be negative")
    +    .createWithDefault(1)
     
       private[spark] val DYN_ALLOCATION_MIN_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.minExecutors").intConf.createWithDefault(0)
    +    ConfigBuilder("spark.dynamicAllocation.minExecutors")
    +      .doc("Lower bound for the number of executors if dynamic allocation is enabled.")
    +      .intConf
    +      .checkValue(v => v >= 0,
    +        "Lower bound for the number of executors should be greater than or equal to 0")
    +      .createWithDefault(0)
     
       private[spark] val DYN_ALLOCATION_INITIAL_EXECUTORS =
         ConfigBuilder("spark.dynamicAllocation.initialExecutors")
    +      .doc("Initial number of executors to run if dynamic allocation is enabled. " +
    +        "If `--num-executors` (or `spark.executor.instances`) is set " +
    +        "and larger than this value, " +
    +        "it will be used as the initial number of executors.")
           .fallbackConf(DYN_ALLOCATION_MIN_EXECUTORS)
     
       private[spark] val DYN_ALLOCATION_MAX_EXECUTORS =
    -    ConfigBuilder("spark.dynamicAllocation.maxExecutors").intConf.createWithDefault(Int.MaxValue)
    +    ConfigBuilder("spark.dynamicAllocation.maxExecutors")
    +      .doc("Upper bound for the number of executors if dynamic allocation is enabled. ")
    +      .intConf
    +      .createWithDefault(Int.MaxValue)
    --- End diff --
    
    check if >= 0?


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    @gatorsmile 
    Could you please review this code again?


---


[GitHub] spark issue #18555: [SPARK-21353][CORE]add checkValue in spark.internal.conf...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/18555
  
    Merged build finished. Test FAILed.


---