Posted to commits@spark.apache.org by do...@apache.org on 2021/11/09 02:38:50 UTC

[spark] branch branch-3.2 updated: [SPARK-37252][PYTHON][TESTS] Ignore `test_memory_limit` on non-Linux environment

This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.2
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.2 by this push:
     new c4090fc  [SPARK-37252][PYTHON][TESTS] Ignore `test_memory_limit` on non-Linux environment
c4090fc is described below

commit c4090fce1e8a68aef5a4ec40588764b25109d29f
Author: Dongjoon Hyun <do...@apache.org>
AuthorDate: Mon Nov 8 18:37:03 2021 -0800

    [SPARK-37252][PYTHON][TESTS] Ignore `test_memory_limit` on non-Linux environment
    
    ### What changes were proposed in this pull request?
    
    This PR aims to ignore `test_memory_limit` on non-Linux environments.
    
    ### Why are the changes needed?
    
    As the documentation in https://github.com/apache/spark/pull/23664 notes, this test fails on non-Linux environments, as in the following macOS example.
    
    **BEFORE**
    ```
    $ build/sbt -Phadoop-cloud -Phadoop-3.2 test:package
    $ python/run-tests --modules pyspark-core
    ...
    ======================================================================
    FAIL: test_memory_limit (pyspark.tests.test_worker.WorkerMemoryTest)
    ----------------------------------------------------------------------
    Traceback (most recent call last):
      File "/Users/dongjoon/APACHE/spark-merge/python/pyspark/tests/test_worker.py", line 212, in test_memory_limit
        self.assertEqual(soft_limit, 2 * 1024 * 1024 * 1024)
    AssertionError: 9223372036854775807 != 2147483648
    
    ----------------------------------------------------------------------
    ```
    
    **AFTER**
    ```
    ...
    Tests passed in 104 seconds
    
    Skipped tests in pyspark.tests.test_serializers with /Users/dongjoon/.pyenv/versions/3.8.12/bin/python3:
        test_serialize (pyspark.tests.test_serializers.SciPyTests) ... skipped 'SciPy not installed'
    
    Skipped tests in pyspark.tests.test_worker with /Users/dongjoon/.pyenv/versions/3.8.12/bin/python3:
        test_memory_limit (pyspark.tests.test_worker.WorkerMemoryTest) ... skipped "Memory limit feature in Python worker is dependent on Python's 'resource' module on Linux; however, not found or not on Linux."
    ```
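    
    For reference, 2147483648 is the 2 GiB value asserted at `test_worker.py` line 212, while 9223372036854775807 is what `resource.RLIM_INFINITY` reports on macOS, i.e. the worker's memory cap was apparently never applied there. A minimal illustrative sketch of that comparison (not the actual test code, which lives in `python/pyspark/tests/test_worker.py`):
    
    ```
    import resource
    
    # 2 GiB: the soft limit the test asserts (see the traceback above).
    expected_soft_limit = 2 * 1024 * 1024 * 1024   # 2147483648
    
    # Address-space limit of the current process. On macOS the cap set by
    # the PySpark worker does not take effect, so it stays at RLIM_INFINITY,
    # reported there as 9223372036854775807 -- the value in the traceback.
    soft_limit, hard_limit = resource.getrlimit(resource.RLIMIT_AS)
    print(soft_limit, expected_soft_limit)
    ```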
    
    ### Does this PR introduce _any_ user-facing change?
    
    No
    
    ### How was this patch tested?
    
    Manual.
    
    Closes #34527 from dongjoon-hyun/SPARK-37252.
    
    Authored-by: Dongjoon Hyun <do...@apache.org>
    Signed-off-by: Dongjoon Hyun <do...@apache.org>
    (cherry picked from commit 2c7f20151e99c212443a1f8762350d0a96a26440)
    Signed-off-by: Dongjoon Hyun <do...@apache.org>
---
 python/pyspark/tests/test_worker.py | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/python/pyspark/tests/test_worker.py b/python/pyspark/tests/test_worker.py
index ebf1e93..b9b7306 100644
--- a/python/pyspark/tests/test_worker.py
+++ b/python/pyspark/tests/test_worker.py
@@ -16,6 +16,7 @@
 # limitations under the License.
 #
 import os
+import sys
 import tempfile
 import threading
 import time
@@ -188,9 +189,9 @@ class WorkerReuseTest(PySparkTestCase):
 
 
 @unittest.skipIf(
-    not has_resource_module,
+    not has_resource_module or sys.platform != 'linux',
     "Memory limit feature in Python worker is dependent on "
-    "Python's 'resource' module; however, not found.")
+    "Python's 'resource' module on Linux; however, not found or not on Linux.")
 class WorkerMemoryTest(unittest.TestCase):
 
     def setUp(self):

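The guard in the diff above combines an availability check with a platform check. A self-contained sketch of the same `unittest.skipIf` pattern (here `has_resource_module` is a local stand-in for the flag that `test_worker.py` imports from PySpark's own helpers):

```
import sys
import unittest

# Local stand-in for the has_resource_module flag used by the real test;
# it is True only when Python's 'resource' module can be imported.
try:
    import resource  # noqa: F401
    has_resource_module = True
except ImportError:
    has_resource_module = False


@unittest.skipIf(
    not has_resource_module or sys.platform != 'linux',
    "Memory limit feature in Python worker is dependent on "
    "Python's 'resource' module on Linux; however, not found or not on Linux.")
class WorkerMemoryTestSketch(unittest.TestCase):
    def test_placeholder(self):
        # Skipped (with the message above) on macOS, Windows, and any
        # platform without the 'resource' module.
        self.assertTrue(True)


if __name__ == '__main__':
    unittest.main()
```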
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org