Posted to issues@spark.apache.org by "Xiao Li (Jira)" <ji...@apache.org> on 2019/09/01 06:31:00 UTC

[jira] [Updated] (SPARK-28535) Flaky test: JobCancellationSuite."interruptible iterator of shuffle reader"

     [ https://issues.apache.org/jira/browse/SPARK-28535?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li updated SPARK-28535:
----------------------------
    Issue Type: Test  (was: Bug)

> Flaky test: JobCancellationSuite."interruptible iterator of shuffle reader"
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-28535
>                 URL: https://issues.apache.org/jira/browse/SPARK-28535
>             Project: Spark
>          Issue Type: Test
>          Components: Tests
>    Affects Versions: 2.3.3, 3.0.0, 2.4.3
>            Reporter: Marcelo Vanzin
>            Assignee: Marcelo Vanzin
>            Priority: Minor
>             Fix For: 2.3.4, 2.4.4, 3.0.0
>
>
> This is the same flakiness as in SPARK-23881, except the fix there didn't really take, at least on our build machines.
> {noformat}
> org.scalatest.exceptions.TestFailedException: 10000 was not less than 10000
>       at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:528)
>       at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
>       at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:501)
> {noformat}
> Since that bug is short on explanations: the issue is a race between the thread posting the "stage completed" event to the listener (which unblocks the test) and the thread killing the task in the executor. If the event arrives first, it unblocks task execution, and there's a chance that all elements will be processed before the executor has a chance to stop the task.
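The shape of the race described above can be sketched with plain Python threads (this is purely illustrative, not the actual Spark/ScalaTest code; all names here are assumptions). It shows the failure mode deterministically: the kill signal arrives only after the task has already consumed every element, so the "fewer than 10000 elements processed" assertion fails.

```python
# Illustrative sketch of the race in SPARK-28535 (not Spark code).
# Two signals race: the listener's "stage completed" event, which unblocks
# the task, and the executor's kill flag, which should interrupt it.
import threading

processed = []
stage_completed = threading.Event()   # posted by the listener thread
task_killed = threading.Event()       # set by the "executor" kill thread

def task(items):
    # The test blocks the task until the listener event arrives.
    stage_completed.wait()
    for x in items:
        # An interruptible iterator checks the kill flag between elements.
        if task_killed.is_set():
            return
        processed.append(x)

task_thread = threading.Thread(target=task, args=(range(10000),))
task_thread.start()

# The listener event fires first and unblocks the task...
stage_completed.set()

# ...and here we deliberately "lose" the race: the kill only lands after
# the task thread has finished consuming every element.
task_thread.join()
task_killed.set()

# All 10000 elements were processed before the kill took effect, which is
# exactly the condition that trips "10000 was not less than 10000".
print(len(processed))
```

The fix space is the usual one for this kind of flakiness: make the test hold the task on a barrier the kill path controls, rather than racing a listener event against the executor's interrupt.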



--
This message was sent by Atlassian Jira
(v8.3.2#803003)
