Posted to commits@spark.apache.org by gu...@apache.org on 2020/06/17 03:16:05 UTC

[spark] branch branch-3.0 updated: [SPARK-32011][PYTHON][CORE] Remove warnings about pin-thread modes and guide to use collectWithJobGroup

This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
     new 83d928b  [SPARK-32011][PYTHON][CORE] Remove warnings about pin-thread modes and guide to use collectWithJobGroup
83d928b is described below

commit 83d928b9805001e6deba3f1599b8b18dc6c857dd
Author: HyukjinKwon <gu...@apache.org>
AuthorDate: Wed Jun 17 12:10:12 2020 +0900

    [SPARK-32011][PYTHON][CORE] Remove warnings about pin-thread modes and guide to use collectWithJobGroup
    
    ### What changes were proposed in this pull request?
    
    This PR proposes to remove the warning about using local properties from multiple threads, and to guide users to `collectWithJobGroup` for the multi-threaded case for now, because:
    - It is too noisy for users who don't use multiple threads - the single-threaded case is arguably far more common.
    - A critical issue was found in pinned-thread mode (SPARK-32010), which will be fixed in Spark 3.1.
    - To allow a smooth migration, `RDD.collectWithJobGroup` was added; it will be deprecated in Spark 3.1 once SPARK-32010 is fixed (a usage sketch follows below).
    
    I am targeting Spark 3.1 to deprecate `RDD.collectWithJobGroup` and to make the pinned-thread mode stable. In future releases, I plan to make that mode the default and remove `RDD.collectWithJobGroup` entirely.
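
    As a rough sketch (not part of this patch), the recommended workaround is to run each job in its own Python thread and attach the job group at collect time via `RDD.collectWithJobGroup` instead of `SparkContext.setJobGroup`. The RDD, group IDs, and thread count below are made up for illustration:

    ```
    from threading import Thread
    from pyspark import SparkContext

    sc = SparkContext.getOrCreate()

    def run_query(group_id):
        rdd = sc.parallelize(range(100)).map(lambda x: x * 2)
        # collectWithJobGroup(groupId, description, interruptOnCancel=False) runs
        # this collect under the given job group for this call only, without
        # relying on thread-local properties shared across PVM/JVM threads.
        result = rdd.collectWithJobGroup(group_id, "query for %s" % group_id)
        print(group_id, len(result))

    threads = [Thread(target=run_query, args=("group-%d" % i,)) for i in range(3)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    ```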
    
    ### Why are the changes needed?
    
    To avoid guiding users to a feature with a critical issue, and to provide a proper workaround for now.
    
    ### Does this PR introduce _any_ user-facing change?
    
    Yes, it changes a user-facing warning message and documentation.
    
    ### How was this patch tested?
    
    Manually tested:
    
    Before:
    
    ```
    >>> spark.sparkContext.setLocalProperty("a", "b")
    /.../spark/python/pyspark/util.py:141: UserWarning: Currently, 'setLocalProperty' (set to local
    properties) with multiple threads does not properly work.
    Internally threads on PVM and JVM are not synced, and JVM thread can be reused for multiple
    threads on PVM, which fails to isolate local properties for each thread on PVM.
    To work around this, you can set PYSPARK_PIN_THREAD to true (see SPARK-22340). However,
    note that it cannot inherit the local properties from the parent thread although it isolates each
    thread on PVM and JVM with its own local properties.
    To work around this, you should manually copy and set the local properties from the parent thread
     to the child thread when you create another thread.
    ```
    
    After:
    ```
    >>> spark.sparkContext.setLocalProperty("a", "b")
    ```
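
    For context, here is a rough illustration (not from this patch) of the thread-isolation problem the removed warning describes: local properties live in a JVM thread-local, and without pinned threads a Python thread is not tied to a fixed JVM thread, so a property set from one Python thread may be observed by, or overwritten from, another. The property key and values below are made up:

    ```
    from threading import Thread
    from pyspark import SparkContext

    sc = SparkContext.getOrCreate()

    def run(tag):
        # Without pinned threads, consecutive calls from this Python thread may be
        # served by different (reused) JVM threads, so the value read back here is
        # not guaranteed to be the one this thread just set.
        sc.setLocalProperty("my.tag", tag)
        print(tag, "->", sc.getLocalProperty("my.tag"))

    threads = [Thread(target=run, args=("tag-%d" % i,)) for i in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    ```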
    
    Closes #28845 from HyukjinKwon/SPARK-32011.
    
    Authored-by: HyukjinKwon <gu...@apache.org>
    Signed-off-by: HyukjinKwon <gu...@apache.org>
    (cherry picked from commit feeca63198466640ac461a2a34922493fa6162a8)
    Signed-off-by: HyukjinKwon <gu...@apache.org>
---
 python/pyspark/context.py | 37 ++++++-------------------------------
 python/pyspark/util.py    | 27 ---------------------------
 2 files changed, 6 insertions(+), 58 deletions(-)

diff --git a/python/pyspark/context.py b/python/pyspark/context.py
index 81b6caa..5bb991e 100644
--- a/python/pyspark/context.py
+++ b/python/pyspark/context.py
@@ -41,7 +41,6 @@ from pyspark.rdd import RDD, _load_from_socket, ignore_unicode_prefix
 from pyspark.traceback_utils import CallSite, first_spark_call
 from pyspark.status import StatusTracker
 from pyspark.profiler import ProfilerCollector, BasicProfiler
-from pyspark.util import _warn_pin_thread
 
 if sys.version > '3':
     xrange = range
@@ -1026,17 +1025,9 @@ class SparkContext(object):
         .. note:: Currently, setting a group ID (set to local properties) with multiple threads
             does not properly work. Internally threads on PVM and JVM are not synced, and JVM
             thread can be reused for multiple threads on PVM, which fails to isolate local
-            properties for each thread on PVM.
-
-            To work around this, you can set `PYSPARK_PIN_THREAD` to
-            `'true'` (see SPARK-22340). However, note that it cannot inherit the local properties
-            from the parent thread although it isolates each thread on PVM and JVM with its own
-            local properties.
-
-            To work around this, you should manually copy and set the local
-            properties from the parent thread to the child thread when you create another thread.
+            properties for each thread on PVM. To work around this, you can use
+            :meth:`RDD.collectWithJobGroup` for now.
         """
-        _warn_pin_thread("setJobGroup")
         self._jsc.setJobGroup(groupId, description, interruptOnCancel)
 
     def setLocalProperty(self, key, value):
@@ -1047,17 +1038,9 @@ class SparkContext(object):
         .. note:: Currently, setting a local property with multiple threads does not properly work.
             Internally threads on PVM and JVM are not synced, and JVM thread
             can be reused for multiple threads on PVM, which fails to isolate local properties
-            for each thread on PVM.
-
-            To work around this, you can set `PYSPARK_PIN_THREAD` to
-            `'true'` (see SPARK-22340). However, note that it cannot inherit the local properties
-            from the parent thread although it isolates each thread on PVM and JVM with its own
-            local properties.
-
-            To work around this, you should manually copy and set the local
-            properties from the parent thread to the child thread when you create another thread.
+            for each thread on PVM. To work around this, you can use
+            :meth:`RDD.collectWithJobGroup`.
         """
-        _warn_pin_thread("setLocalProperty")
         self._jsc.setLocalProperty(key, value)
 
     def getLocalProperty(self, key):
@@ -1074,17 +1057,9 @@ class SparkContext(object):
         .. note:: Currently, setting a job description (set to local properties) with multiple
             threads does not properly work. Internally threads on PVM and JVM are not synced,
             and JVM thread can be reused for multiple threads on PVM, which fails to isolate
-            local properties for each thread on PVM.
-
-            To work around this, you can set `PYSPARK_PIN_THREAD` to
-            `'true'` (see SPARK-22340). However, note that it cannot inherit the local properties
-            from the parent thread although it isolates each thread on PVM and JVM with its own
-            local properties.
-
-            To work around this, you should manually copy and set the local
-            properties from the parent thread to the child thread when you create another thread.
+            local properties for each thread on PVM. To work around this, you can use
+            :meth:`RDD.collectWithJobGroup` for now.
         """
-        _warn_pin_thread("setJobDescription")
         self._jsc.setJobDescription(value)
 
     def sparkUser(self):
diff --git a/python/pyspark/util.py b/python/pyspark/util.py
index 9313756..6e53e57 100644
--- a/python/pyspark/util.py
+++ b/python/pyspark/util.py
@@ -114,33 +114,6 @@ def fail_on_stopiteration(f):
     return wrapper
 
 
-def _warn_pin_thread(name):
-    if os.environ.get("PYSPARK_PIN_THREAD", "false").lower() == "true":
-        msg = (
-            "PYSPARK_PIN_THREAD feature is enabled. "
-            "However, note that it cannot inherit the local properties from the parent thread "
-            "although it isolates each thread on PVM and JVM with its own local properties. "
-            "\n"
-            "To work around this, you should manually copy and set the local properties from "
-            "the parent thread to the child thread when you create another thread.")
-    else:
-        msg = (
-            "Currently, '%s' (set to local properties) with multiple threads does "
-            "not properly work. "
-            "\n"
-            "Internally threads on PVM and JVM are not synced, and JVM thread can be reused "
-            "for multiple threads on PVM, which fails to isolate local properties for each "
-            "thread on PVM. "
-            "\n"
-            "To work around this, you can set PYSPARK_PIN_THREAD to true (see SPARK-22340). "
-            "However, note that it cannot inherit the local properties from the parent thread "
-            "although it isolates each thread on PVM and JVM with its own local properties. "
-            "\n"
-            "To work around this, you should manually copy and set the local properties from "
-            "the parent thread to the child thread when you create another thread." % name)
-    warnings.warn(msg, UserWarning)
-
-
 def _print_missing_jar(lib_name, pkg_name, jar_name, spark_version):
     print("""
 ________________________________________________________________________________________________


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org