Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2019/03/02 14:50:00 UTC

[jira] [Created] (SPARK-27032) Flaky test: org.apache.spark.sql.execution.streaming.HDFSMetadataLogSuite.HDFSMetadataLog: metadata directory collision

Sean Owen created SPARK-27032:
---------------------------------

             Summary: Flaky test: org.apache.spark.sql.execution.streaming.HDFSMetadataLogSuite.HDFSMetadataLog: metadata directory collision
                 Key: SPARK-27032
                 URL: https://issues.apache.org/jira/browse/SPARK-27032
             Project: Spark
          Issue Type: Test
          Components: Spark Core, Tests
    Affects Versions: 3.0.0
            Reporter: Sean Owen
            Assignee: Sean Owen


Locally and on Jenkins, I've frequently seen this test fail:

{code}
Error Message
The await method on Waiter timed out.
Stacktrace
      org.scalatest.exceptions.TestFailedException: The await method on Waiter timed out.
      at org.scalatest.concurrent.Waiters$Waiter.awaitImpl(Waiters.scala:406)
      at org.scalatest.concurrent.Waiters$Waiter.await(Waiters.scala:540)
      at org.apache.spark.sql.execution.streaming.HDFSMetadataLogSuite.$anonfun$new$19(HDFSMetadataLogSuite.scala:158)
      at org.apache.spark.sql.execution.streaming.HDFSMetadataLogSuite.$anonfun$new$19$adapted(HDFSMetadataLogSuite.scala:133)
...
{code}

See for example https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-maven-hadoop-2.7/6057/testReport/

There aren't obvious errors or problems with the test. Because it passes sometimes, my guess is that the timeout is simply too short, or that the test simply takes too long to finish. I'd like to try reducing the number of threads and batches in the test.
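
For context, the stack trace points at the ScalaTest Waiters pattern: worker threads call dismiss() on a shared Waiter, and the main thread blocks in await() until every dismissal arrives or the timeout expires. Below is a minimal, self-contained sketch of that pattern; the thread/batch counts, the sleep, and the 30-second timeout are illustrative assumptions, not the actual HDFSMetadataLogSuite code.

{code}
import org.scalatest.concurrent.Waiters
import org.scalatest.time.SpanSugar._

// Hypothetical sketch of the Waiters pattern used by the flaky test; the
// constants and the body of the worker loop are placeholders, not the real
// metadata-directory-collision logic.
object WaiterTimeoutSketch extends Waiters {
  def main(args: Array[String]): Unit = {
    val numThreads = 5   // fewer threads => less contention, faster test
    val numBatches = 10  // fewer batches per thread => faster test
    val waiter = new Waiter

    val threads = (1 to numThreads).map { _ =>
      new Thread(() => {
        try {
          (1 to numBatches).foreach { _ =>
            // ... concurrent writes to a shared metadata directory go here ...
            Thread.sleep(1)
          }
        } catch {
          // Record any failure so await() rethrows it on the main thread.
          case t: Throwable => waiter { throw t }
        } finally {
          waiter.dismiss() // every worker must dismiss exactly once
        }
      })
    }
    threads.foreach(_.start())

    // This is the call that produces "The await method on Waiter timed out."
    // when the workers do not all dismiss in time. Raising the timeout, or
    // lowering numThreads/numBatches so the work finishes sooner, makes the
    // timeout less likely to fire.
    waiter.await(timeout(30.seconds), dismissals(numThreads))
    threads.foreach(_.join())
  }
}
{code}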



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org