Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2019/01/01 05:55:00 UTC
[jira] [Updated] (SPARK-25903) Flaky test: BarrierTaskContextSuite.throw exception on barrier() call timeout
[ https://issues.apache.org/jira/browse/SPARK-25903?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-25903:
----------------------------------
Affects Version/s: 3.0.0
> Flaky test: BarrierTaskContextSuite.throw exception on barrier() call timeout
> -----------------------------------------------------------------------------
>
> Key: SPARK-25903
> URL: https://issues.apache.org/jira/browse/SPARK-25903
> Project: Spark
> Issue Type: Bug
> Components: Tests
> Affects Versions: 2.4.0, 3.0.0
> Reporter: Marcelo Vanzin
> Priority: Minor
>
> We hit this in our internal builds.
> {noformat}
> Expected exception org.apache.spark.SparkException to be thrown, but no exception was thrown
> Stacktrace
> org.scalatest.exceptions.TestFailedException: Expected exception org.apache.spark.SparkException to be thrown, but no exception was thrown
> at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:528)
> at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
> at org.scalatest.Assertions$class.intercept(Assertions.scala:822)
> at org.scalatest.FunSuite.intercept(FunSuite.scala:1560)
> at org.apache.spark.scheduler.BarrierTaskContextSuite$$anonfun$7.apply(BarrierTaskContextSuite.scala:94)
> at org.apache.spark.scheduler.BarrierTaskContextSuite$$anonfun$7.apply(BarrierTaskContextSuite.scala:76)
> at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> {noformat}
> The logs show that the first task took a while to call {{barrier()}}; by that point the "slow" task was already running, so its sleep finished before the 2s timeout expired and the expected {{SparkException}} was never thrown.
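> The race can be sketched with numbers (the 2s timeout is from the failure above; the other constants are illustrative, not taken from the actual suite). The key point is that the barrier timeout clock starts when the first task calls {{barrier()}}, not when the stage launches:
> {code:scala}
> // Hypothetical sketch of the race, not the actual BarrierTaskContextSuite code.
> val timeoutMs = 2000L          // barrier() call timeout expected by the test
> val firstCallDelayMs = 1500L   // hypothetical: fast task is slow to reach barrier()
> val slowTaskSleepMs = 3000L    // hypothetical: slow task sleeps before barrier()
>
> // Both tasks start at t = 0. The fast task reaches barrier() at t = 1500,
> // which is when the timeout clock starts. The slow task reaches barrier()
> // at t = 3000, only 1500 ms later -- inside the 2000 ms window, so no
> // timeout occurs and intercept[SparkException] fails.
> val gapMs = slowTaskSleepMs - firstCallDelayMs
> assert(gapMs < timeoutMs)  // the race: the expected timeout never fires
> {code}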
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)