Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2019/10/08 05:42:13 UTC
[jira] [Resolved] (SPARK-24245) Flaky test: KafkaContinuousSinkSuite
[ https://issues.apache.org/jira/browse/SPARK-24245?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-24245.
----------------------------------
Resolution: Incomplete
> Flaky test: KafkaContinuousSinkSuite
> ------------------------------------
>
> Key: SPARK-24245
> URL: https://issues.apache.org/jira/browse/SPARK-24245
> Project: Spark
> Issue Type: Bug
> Components: Structured Streaming, Tests
> Affects Versions: 2.3.1
> Reporter: Marcelo Masiero Vanzin
> Priority: Major
> Labels: bulk-closed
>
> This test appears to be broken or flaky. From Jenkins:
> https://amplab.cs.berkeley.edu/jenkins/user/vanzin/my-views/view/Spark/job/spark-branch-2.3-test-maven-hadoop-2.6/367/
> {noformat}
> Caused by: org.scalatest.exceptions.TestFailedException: -1 did not equal 0
> at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:528)
> at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
> at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:501)
> at org.apache.spark.sql.kafka010.KafkaContinuousTest$$anonfun$afterEach$1.apply(KafkaContinuousTest.scala:76)
> at org.apache.spark.sql.kafka010.KafkaContinuousTest$$anonfun$afterEach$1.apply(KafkaContinuousTest.scala:76)
> at org.scalatest.concurrent.Eventually$class.makeAValiantAttempt$1(Eventually.scala:395)
> at org.scalatest.concurrent.Eventually$class.tryTryAgain$1(Eventually.scala:409)
> ... 54 more
> SUITE ABORTED - KafkaContinuousSinkSuite: The code passed to eventually never returned normally. Attempted 1990 times over 30.007800773 seconds. Last failure message: -1 did not equal 0.
> {noformat}
> We should fix it or disable it in 2.3.
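
The "Attempted 1990 times over 30.007800773 seconds" message above comes from ScalaTest's Eventually trait, which re-runs an assertion block until it stops throwing or a patience window expires; the flaky suite hits the timeout because the assertion at KafkaContinuousTest.scala:76 never becomes true. A minimal sketch of that retry semantics (the names `retryUntil`, `timeoutMs`, and `intervalMs` are illustrative, not the ScalaTest API):

```scala
// Sketch of the retry loop behind ScalaTest's `eventually`:
// keep re-evaluating `block` until it returns normally, or give up
// once the patience window has elapsed, reporting the attempt count
// and the last failure, as seen in the Jenkins log above.
object EventuallySketch {
  def retryUntil[T](timeoutMs: Long, intervalMs: Long)(block: => T): T = {
    val deadline = System.currentTimeMillis() + timeoutMs
    var attempts = 0
    while (true) {
      attempts += 1
      try {
        return block // assertion passed: return its result
      } catch {
        case e: Throwable =>
          if (System.currentTimeMillis() >= deadline)
            throw new RuntimeException(
              s"The code passed to eventually never returned normally. " +
                s"Attempted $attempts times.", e)
          Thread.sleep(intervalMs) // back off, then try again
      }
    }
    throw new IllegalStateException("unreachable")
  }

  def main(args: Array[String]): Unit = {
    // Example: a counter that only reaches the expected value after a
    // few polls, mimicking a streaming query draining its epochs.
    var pendingEpochs = 3
    val settled = retryUntil(timeoutMs = 1000L, intervalMs = 10L) {
      pendingEpochs -= 1
      assert(pendingEpochs == 0, s"$pendingEpochs did not equal 0")
      pendingEpochs
    }
    println(settled)
  }
}
```

A fix for the flakiness would typically either make the awaited condition reliably reachable or lengthen the patience window, rather than change this retry mechanism itself.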
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org