Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2020/01/10 13:41:00 UTC
[jira] [Reopened] (SPARK-30480) Pyspark test "test_memory_limit" fails consistently
[ https://issues.apache.org/jira/browse/SPARK-30480?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon reopened SPARK-30480:
----------------------------------
Reverted at [https://github.com/apache/spark/commit/d0983af38ffb123fa440bc5fcf3912db7658dd28]
> Pyspark test "test_memory_limit" fails consistently
> ---------------------------------------------------
>
> Key: SPARK-30480
> URL: https://issues.apache.org/jira/browse/SPARK-30480
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 3.0.0
> Reporter: Jungtaek Lim
> Priority: Major
> Fix For: 3.0.0
>
>
> I'm seeing consistent PySpark test failures on multiple PRs ([#26955|https://github.com/apache/spark/pull/26955], [#26201|https://github.com/apache/spark/pull/26201], [#27064|https://github.com/apache/spark/pull/27064]), and they all failed in "test_memory_limit".
> [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/116422/testReport]
> [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/116438/testReport]
> [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/116429/testReport]
> [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/116366/testReport]
>
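For context, a minimal sketch of the mechanism a test like "test_memory_limit" typically exercises: capping a Python worker's address space via the standard-library resource module, which is how PySpark workers enforce spark.executor.pyspark.memory. The 2 GiB figure and the standalone script below are illustrative assumptions, not taken from the test itself, and RLIMIT_AS behaves differently across platforms (e.g. it is not honored the same way on macOS), which is one common source of flakiness in tests of this kind.

```python
import resource

# Illustrative value only; the actual limit in the Spark test is
# derived from the spark.executor.pyspark.memory configuration.
limit_bytes = 2 * 1024 * 1024 * 1024  # 2 GiB

# Lowering the soft/hard address-space limit of the current process.
# (Lowering is always permitted; raising the hard limit is not.)
resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, limit_bytes))

soft, hard = resource.getrlimit(resource.RLIMIT_AS)
print(soft, hard)
```

A test built on this pattern usually sets the limit, then reads it back and compares it to the configured value; platform-dependent rlimit semantics are why such assertions can fail consistently on some build machines while passing elsewhere.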
--
This message was sent by Atlassian Jira
(v8.3.4#803005)