Posted to issues@spark.apache.org by "Imran Rashid (JIRA)" <ji...@apache.org> on 2018/11/20 18:32:00 UTC
[jira] [Resolved] (SPARK-26079) Flaky test: StreamingQueryListenersConfSuite
[ https://issues.apache.org/jira/browse/SPARK-26079?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Imran Rashid resolved SPARK-26079.
----------------------------------
Resolution: Fixed
Assignee: Marcelo Vanzin
Fix Version/s: 3.0.0
2.4.1
Fixed by https://github.com/apache/spark/pull/23050
> Flaky test: StreamingQueryListenersConfSuite
> --------------------------------------------
>
> Key: SPARK-26079
> URL: https://issues.apache.org/jira/browse/SPARK-26079
> Project: Spark
> Issue Type: Bug
> Components: SQL, Tests
> Affects Versions: 2.4.0
> Reporter: Marcelo Vanzin
> Assignee: Marcelo Vanzin
> Priority: Minor
> Fix For: 2.4.1, 3.0.0
>
>
> We've had this test fail a few times in our builds.
> {noformat}
> org.scalatest.exceptions.TestFailedException: null equaled null
> at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:528)
> at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
> at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:501)
> at org.apache.spark.sql.streaming.StreamingQueryListenersConfSuite$$anonfun$1.apply(StreamingQueryListenersConfSuite.scala:45)
> at org.apache.spark.sql.streaming.StreamingQueryListenersConfSuite$$anonfun$1.apply(StreamingQueryListenersConfSuite.scala:38)
> at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> at org.scalatest.Transformer.apply(Transformer.scala:22)
> at org.scalatest.Transformer.apply(Transformer.scala:20)
> at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> {noformat}
> You can reproduce it reliably by adding a sleep in the test listener. Fix coming up.
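To illustrate the kind of race described above, here is a hedged, self-contained Scala sketch (names are illustrative, not the actual suite code): a listener delivers its event on a separate thread, so a test that asserts on the captured state immediately can observe the still-unset value, which is how two unset references end up compared and reported as "null equaled null". Adding a sleep in the listener, as the report suggests, widens the window. The robust variant polls with a deadline until the event arrives.

```scala
// Illustrative sketch of the race; not the StreamingQueryListenersConfSuite code.
object ListenerRaceSketch {
  @volatile private var captured: String = null // written by the "listener" thread

  def main(args: Array[String]): Unit = {
    // Simulate the listener bus delivering an event asynchronously; a sleep in
    // the listener (as in the reproduction hint) delays the write.
    val bus = new Thread(() => { Thread.sleep(200); captured = "terminated" })
    bus.start()

    // Flaky version: asserting on `captured` right here can observe null,
    // so a comparison of two unset values reports "null equaled null".

    // Robust version: wait, with a deadline, until the listener has fired.
    val deadline = System.nanoTime() + 5000000000L // 5 seconds
    while (captured == null && System.nanoTime() < deadline) Thread.sleep(10)
    assert(captured == "terminated", "listener event never delivered")
    println("listener event observed")
  }
}
```

The design point: tests that observe asynchronous listener state should block on the delivery (or use an eventually-style wait) rather than assert immediately after triggering the event.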
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)