Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2018/10/31 20:09:00 UTC

[jira] [Created] (SPARK-25903) Flaky test: BarrierTaskContextSuite.throw exception on barrier() call timeout

Marcelo Vanzin created SPARK-25903:
--------------------------------------

             Summary: Flaky test: BarrierTaskContextSuite.throw exception on barrier() call timeout
                 Key: SPARK-25903
                 URL: https://issues.apache.org/jira/browse/SPARK-25903
             Project: Spark
          Issue Type: Bug
          Components: Tests
    Affects Versions: 2.4.0
            Reporter: Marcelo Vanzin


We hit this in our internal builds.

{noformat}
Expected exception org.apache.spark.SparkException to be thrown, but no exception was thrown
Stacktrace
      org.scalatest.exceptions.TestFailedException: Expected exception org.apache.spark.SparkException to be thrown, but no exception was thrown
      at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:528)
      at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
      at org.scalatest.Assertions$class.intercept(Assertions.scala:822)
      at org.scalatest.FunSuite.intercept(FunSuite.scala:1560)
      at org.apache.spark.scheduler.BarrierTaskContextSuite$$anonfun$7.apply(BarrierTaskContextSuite.scala:94)
      at org.apache.spark.scheduler.BarrierTaskContextSuite$$anonfun$7.apply(BarrierTaskContextSuite.scala:76)
      at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
{noformat}

From the logs, the problem is that the first task took a while to call {{barrier()}}. By that point the "slow" task was already running, so its sleep finished before the 2s barrier timeout expired, the barrier sync succeeded, and the expected {{SparkException}} was never thrown.
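For context, the test follows roughly the shape sketched below (illustrative only, not the exact BarrierTaskContextSuite source; the config name, partition count, and timings are assumptions). One "slow" partition sleeps past the configured barrier sync timeout, so {{barrier()}} is expected to time out and fail the job. The race is that the timeout window only opens when the first sync request reaches the coordinator, so if that first {{barrier()}} call is itself delayed, the slow task's sleep can end inside the window and no exception is raised.

{code:scala}
// Rough sketch of the test's shape (illustrative only, not the exact
// BarrierTaskContextSuite source; config name and timings are assumptions).
import org.apache.spark.{BarrierTaskContext, SparkConf, SparkContext, SparkException}
import org.scalatest.FunSuite

class BarrierTimeoutSketch extends FunSuite {
  test("throw exception on barrier() call timeout") {
    val conf = new SparkConf()
      .setMaster("local-cluster[4, 1, 1024]")
      .setAppName("test-cluster")
      // All barrier sync requests must reach the coordinator within this window.
      .set("spark.barrier.sync.timeout", "2")
    val sc = new SparkContext(conf)
    try {
      val rdd = sc.makeRDD(1 to 10, 4).barrier().mapPartitions { it =>
        val context = BarrierTaskContext.get()
        // One "slow" partition sleeps longer than the 2s timeout before syncing.
        if (context.partitionId() == 0) Thread.sleep(3000)
        context.barrier()
        it
      }
      // Expected: the coordinator times out waiting for the slow task and the
      // job fails with a SparkException. Flaky case: if the first barrier()
      // call is delayed while the slow task is already sleeping, the timeout
      // window opens late, the sleep ends inside it, and no exception is thrown.
      intercept[SparkException] {
        rdd.collect()
      }
    } finally {
      sc.stop()
    }
  }
}
{code}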




