Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2021/11/09 02:41:00 UTC

[jira] [Resolved] (SPARK-37252) Ignore test_memory_limit on non-Linux environment

     [ https://issues.apache.org/jira/browse/SPARK-37252?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-37252.
-----------------------------------
    Fix Version/s: 3.1.3
                   3.2.1
                   3.3.0
         Assignee: Dongjoon Hyun
       Resolution: Fixed

This is resolved via https://github.com/apache/spark/pull/34527
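
For context, the assertion fails on non-Linux systems because resource.RLIMIT_AS is not enforced there: on macOS, getrlimit() reports RLIM_INFINITY (2**63 - 1 = 9223372036854775807) rather than the 2 GiB (2 * 1024 * 1024 * 1024 = 2147483648 bytes) soft limit the test sets, so the equality check can never pass. Below is a minimal sketch of the kind of non-Linux skip the fix applies; the decorator placement and skip message here are illustrative assumptions, not the exact patch in the PR:

{code}
import sys
import unittest

class WorkerMemoryTest(unittest.TestCase):

    # RLIMIT_AS is only reliably enforced on Linux; on macOS getrlimit()
    # returns RLIM_INFINITY, so the 2 GiB soft-limit assertion cannot hold.
    @unittest.skipIf(
        not sys.platform.startswith("linux"),
        "resource.RLIMIT_AS is only enforced on Linux",
    )
    def test_memory_limit(self):
        import resource  # deferred import: the module is unavailable on Windows

        soft_limit, _hard_limit = resource.getrlimit(resource.RLIMIT_AS)
        self.assertEqual(soft_limit, 2 * 1024 * 1024 * 1024)
{code}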

> Ignore test_memory_limit on non-Linux environment
> -------------------------------------------------
>
>                 Key: SPARK-37252
>                 URL: https://issues.apache.org/jira/browse/SPARK-37252
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, Tests
>    Affects Versions: 3.2.0, 3.3.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Minor
>             Fix For: 3.1.3, 3.2.1, 3.3.0
>
>
> {code}
> $ build/sbt -Phadoop-cloud -Phadoop-3.2 test:package
> $ python/run-tests
> ...
> ======================================================================
> FAIL: test_memory_limit (pyspark.tests.test_worker.WorkerMemoryTest)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/Users/dongjoon/APACHE/spark-merge/python/pyspark/tests/test_worker.py", line 212, in test_memory_limit
>     self.assertEqual(soft_limit, 2 * 1024 * 1024 * 1024)
> AssertionError: 9223372036854775807 != 2147483648
> ----------------------------------------------------------------------
> {code}



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org