Posted to commits@spark.apache.org by gu...@apache.org on 2020/04/05 04:18:40 UTC

[spark] 03/08: [SPARK-30909][CORE][DOC] Add version information to the configuration of Python

This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git

commit 6e04912c528038834501301587865c1a621112f7
Author: beliefer <be...@163.com>
AuthorDate: Thu Feb 27 10:57:34 2020 +0900

    [SPARK-30909][CORE][DOC] Add version information to the configuration of Python
    
    ### What changes were proposed in this pull request?
    1. Add version information to the configuration of `Python`.
    2. Update the docs of `Python`.
    
    I sorted out the information shown below; a short sketch of the new `version` pattern follows the table.
    
    Item name | Since version | JIRA ID | Commit ID | Note
    -- | -- | -- | -- | --
    spark.python.worker.reuse | 1.2.0 | SPARK-3030 | 2aea0da84c58a179917311290083456dfa043db7#diff-0a67bc4d171abe4df8eb305b0f4123a2 |  
    spark.python.task.killTimeout | 2.2.2 | SPARK-22535 | be68f86e11d64209d9e325ce807025318f383bea#diff-0a67bc4d171abe4df8eb305b0f4123a2 |  
    spark.python.use.daemon | 2.3.0 | SPARK-22554 | 57c5514de9dba1c14e296f85fb13fef23ce8c73f#diff-9008ad45db34a7eee2e265a50626841b |  
    spark.python.daemon.module | 2.4.0 | SPARK-22959 | afae8f2bc82597593595af68d1aa2d802210ea8b#diff-9008ad45db34a7eee2e265a50626841b |  
    spark.python.worker.module | 2.4.0 | SPARK-22959 | afae8f2bc82597593595af68d1aa2d802210ea8b#diff-9008ad45db34a7eee2e265a50626841b |  
    spark.executor.pyspark.memory | 2.4.0 | SPARK-25004 | 7ad18ee9f26e75dbe038c6034700f9cd4c0e2baa#diff-6bdad48cfc34314e89599655442ff210 |  
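
    A minimal sketch (assuming only the `ConfigBuilder` DSL in `org.apache.spark.internal.config`) of the pattern this change applies to each entry above; the chained `.version(...)` call is the added line, mirrored in the diff below:

    ```scala
    import java.util.concurrent.TimeUnit

    import org.apache.spark.internal.config.ConfigBuilder

    // The .version("...") call records the release that introduced the key.
    val PYTHON_TASK_KILL_TIMEOUT = ConfigBuilder("spark.python.task.killTimeout")
      .version("2.2.2") // the "Since version" column in the table above
      .timeConf(TimeUnit.MILLISECONDS)
      .createWithDefaultString("2s")
    ```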
    
    ### Why are the changes needed?
    Supplements the configuration with version information.
    
    ### Does this PR introduce any user-facing change?
    No
    
    ### How was this patch tested?
    Existing UTs.
    
    Closes #27704 from beliefer/add-version-to-python-config.
    
    Authored-by: beliefer <be...@163.com>
    Signed-off-by: HyukjinKwon <gu...@apache.org>
---
 core/src/main/scala/org/apache/spark/internal/config/Python.scala | 6 ++++++
 docs/configuration.md                                             | 6 ++++--
 2 files changed, 10 insertions(+), 2 deletions(-)

diff --git a/core/src/main/scala/org/apache/spark/internal/config/Python.scala b/core/src/main/scala/org/apache/spark/internal/config/Python.scala
index 26a0598..188d884 100644
--- a/core/src/main/scala/org/apache/spark/internal/config/Python.scala
+++ b/core/src/main/scala/org/apache/spark/internal/config/Python.scala
@@ -22,26 +22,32 @@ import org.apache.spark.network.util.ByteUnit
 
 private[spark] object Python {
   val PYTHON_WORKER_REUSE = ConfigBuilder("spark.python.worker.reuse")
+    .version("1.2.0")
     .booleanConf
     .createWithDefault(true)
 
   val PYTHON_TASK_KILL_TIMEOUT = ConfigBuilder("spark.python.task.killTimeout")
+    .version("2.2.2")
     .timeConf(TimeUnit.MILLISECONDS)
     .createWithDefaultString("2s")
 
   val PYTHON_USE_DAEMON = ConfigBuilder("spark.python.use.daemon")
+    .version("2.3.0")
     .booleanConf
     .createWithDefault(true)
 
   val PYTHON_DAEMON_MODULE = ConfigBuilder("spark.python.daemon.module")
+    .version("2.4.0")
     .stringConf
     .createOptional
 
   val PYTHON_WORKER_MODULE = ConfigBuilder("spark.python.worker.module")
+    .version("2.4.0")
     .stringConf
     .createOptional
 
   val PYSPARK_EXECUTOR_MEMORY = ConfigBuilder("spark.executor.pyspark.memory")
+    .version("2.4.0")
     .bytesConf(ByteUnit.MiB)
     .createOptional
 }
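
A minimal usage sketch, not part of this patch: reading these entries is unchanged by the version metadata. It assumes the package-internal `SparkConf.get(ConfigEntry)` reader used throughout Spark core, so it only compiles inside the `org.apache.spark` package:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.internal.config.Python._

val conf = new SparkConf()
// Typed reads fall back to the declared defaults.
val reuse: Boolean = conf.get(PYTHON_WORKER_REUSE)          // true unless overridden
val daemon: Option[String] = conf.get(PYTHON_DAEMON_MODULE) // None unless set
```
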
diff --git a/docs/configuration.md b/docs/configuration.md
index 4aff1ef..88edaf6 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -136,7 +136,7 @@ of the most common options to set are:
 ### Application Properties
 
 <table class="table">
-<tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
+<tr><th>Property Name</th><th>Default</th><th>Meaning</th><th>Since Version</th></tr>
 <tr>
   <td><code>spark.app.name</code></td>
   <td>(none)</td>
@@ -258,6 +258,7 @@ of the most common options to set are:
     limitations are inherited. For instance, Windows does not support resource limiting and actual 
     resource is not limited on MacOS.
   </td>
+  <td>2.4.0</td>
 </tr>
 <tr>
  <td><code>spark.executor.memoryOverhead</code></td>
@@ -420,7 +421,7 @@ Apart from these, the following properties are also available, and may be useful
 ### Runtime Environment
 
 <table class="table">
-<tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
+<tr><th>Property Name</th><th>Default</th><th>Meaning</th><th>Since Version</th></tr>
 <tr>
   <td><code>spark.driver.extraClassPath</code></td>
   <td>(none)</td>
@@ -659,6 +660,7 @@ Apart from these, the following properties are also available, and may be useful
     if there is a large broadcast, then the broadcast will not need to be transferred
     from JVM to Python worker for every task.
   </td>
+  <td>1.2.0</td>
 </tr>
 <tr>
   <td><code>spark.files</code></td>


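To cross-check the new "Since Version" docs column against the code, a hedged sketch (assuming `ConfigEntry` exposes the `key` and `version` fields on this branch; package-internal, like the entries themselves):

```scala
import org.apache.spark.internal.config.{ConfigEntry, Python}

// Each key should print the same version listed in the table above.
Seq[ConfigEntry[_]](
  Python.PYTHON_WORKER_REUSE, Python.PYTHON_TASK_KILL_TIMEOUT,
  Python.PYTHON_USE_DAEMON, Python.PYTHON_DAEMON_MODULE,
  Python.PYTHON_WORKER_MODULE, Python.PYSPARK_EXECUTOR_MEMORY
).foreach(e => println(s"${e.key} -> since ${e.version}"))
```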