Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2021/03/03 23:42:00 UTC

[jira] [Resolved] (SPARK-34604) Flaky test: TaskContextTestsWithWorkerReuse.test_task_context_correct_with_python_worker_reuse

     [ https://issues.apache.org/jira/browse/SPARK-34604?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-34604.
----------------------------------
    Fix Version/s: 3.1.2
                   3.0.3
                   3.2.0
       Resolution: Fixed

Issue resolved by pull request 31723
[https://github.com/apache/spark/pull/31723]

> Flaky test: TaskContextTestsWithWorkerReuse.test_task_context_correct_with_python_worker_reuse
> ----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-34604
>                 URL: https://issues.apache.org/jira/browse/SPARK-34604
>             Project: Spark
>          Issue Type: Test
>          Components: PySpark
>    Affects Versions: 3.2.0
>            Reporter: Hyukjin Kwon
>            Assignee: Hyukjin Kwon
>            Priority: Major
>             Fix For: 3.2.0, 3.0.3, 3.1.2
>
>
> {code}
> ======================================================================
> ERROR [1.798s]: test_task_context_correct_with_python_worker_reuse (pyspark.tests.test_taskcontext.TaskContextTestsWithWorkerReuse)
> Verify the task context correct when reused python worker
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/home/jenkins/python/pyspark/tests/test_taskcontext.py", line 289, in test_task_context_correct_with_python_worker_reuse
>     self.assertTrue(pid in worker_pids)
> AssertionError: False is not true
> ----------------------------------------------------------------------
> {code}
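
For context, below is a minimal, hypothetical sketch of the kind of check this test performs; it is not the actual code from pyspark/tests/test_taskcontext.py, and the variable names and job shapes are illustrative only. With spark.python.worker.reuse enabled (the default), Python worker PIDs observed in a later job are expected to come from the pool of workers seen in an earlier job, which is the assertion that failed intermittently above.

{code}
import os
from pyspark import SparkConf, SparkContext

# Assumption: worker reuse is enabled, as in TaskContextTestsWithWorkerReuse.
conf = SparkConf().set("spark.python.worker.reuse", "true")
sc = SparkContext(master="local[2]", conf=conf)

# First job: record the PIDs of the Python workers that get spawned.
worker_pids = set(sc.parallelize(range(10), 2).map(lambda _: os.getpid()).collect())

# Second job: with reuse, each reported PID should already be in that pool.
reused_pids = sc.parallelize(range(10), 2).map(lambda _: os.getpid()).collect()
for pid in reused_pids:
    # Roughly the check that flaked: a worker PID not seen in the first job
    # (e.g. a worker was respawned) makes this assertion fail.
    assert pid in worker_pids

sc.stop()
{code}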



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org