Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2014/12/15 23:35:13 UTC

[jira] [Resolved] (SPARK-4826) Possible flaky tests in WriteAheadLogBackedBlockRDDSuite: "java.lang.IllegalStateException: File exists and there is no append support!"

     [ https://issues.apache.org/jira/browse/SPARK-4826?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen resolved SPARK-4826.
-------------------------------
       Resolution: Fixed
    Fix Version/s: 1.2.1
                   1.3.0

Issue resolved by pull request 3704
[https://github.com/apache/spark/pull/3704]

> Possible flaky tests in WriteAheadLogBackedBlockRDDSuite: "java.lang.IllegalStateException: File exists and there is no append support!"
> ----------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-4826
>                 URL: https://issues.apache.org/jira/browse/SPARK-4826
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.2.0, 1.3.0
>            Reporter: Josh Rosen
>            Assignee: Tathagata Das
>              Labels: flaky-test
>             Fix For: 1.3.0, 1.2.1
>
>
> I saw a recent master Maven build failure in WriteAheadLogBackedBlockRDDSuite where four tests failed with the same exception.
> [Link to test result (this will eventually break)|https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-Maven-pre-YARN/1156/]. In case that link breaks, the relevant output is reproduced below.
> The failed tests:
> {code}
> org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDDSuite.Read data available only in block manager, not in write ahead log
> org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDDSuite.Read data available only in write ahead log, not in block manager
> org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDDSuite.Read data available only in write ahead log, and test storing in block manager
> org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDDSuite.Read data with partially available in block manager, and rest in write ahead log
> {code}
> The error messages are all (essentially) the same:
> {code}
>      java.lang.IllegalStateException: File exists and there is no append support!
>       at org.apache.spark.streaming.util.HdfsUtils$.getOutputStream(HdfsUtils.scala:33)
>       at org.apache.spark.streaming.util.WriteAheadLogWriter.org$apache$spark$streaming$util$WriteAheadLogWriter$$stream$lzycompute(WriteAheadLogWriter.scala:34)
>       at org.apache.spark.streaming.util.WriteAheadLogWriter.org$apache$spark$streaming$util$WriteAheadLogWriter$$stream(WriteAheadLogWriter.scala:34)
>       at org.apache.spark.streaming.util.WriteAheadLogWriter.<init>(WriteAheadLogWriter.scala:42)
>       at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDDSuite.writeLogSegments(WriteAheadLogBackedBlockRDDSuite.scala:140)
>       at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDDSuite.org$apache$spark$streaming$rdd$WriteAheadLogBackedBlockRDDSuite$$testRDD(WriteAheadLogBackedBlockRDDSuite.scala:95)
>       at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDDSuite$$anonfun$4.apply$mcV$sp(WriteAheadLogBackedBlockRDDSuite.scala:67)
>       at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDDSuite$$anonfun$4.apply(WriteAheadLogBackedBlockRDDSuite.scala:67)
>       at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDDSuite$$anonfun$4.apply(WriteAheadLogBackedBlockRDDSuite.scala:67)
>       at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
>       at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
>       at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
>       at org.scalatest.Transformer.apply(Transformer.scala:22)
>       at org.scalatest.Transformer.apply(Transformer.scala:20)
>       at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
>       at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
>       at org.scalatest.FunSuite.withFixture(FunSuite.scala:1555)
>       at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
>       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
>       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
>       at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
>       at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
>       at org.scalatest.FunSuite.runTest(FunSuite.scala:1555)
>       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
>       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
>       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
>       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
>       at scala.collection.immutable.List.foreach(List.scala:318)
>       at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
>       at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
>       at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
>       at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
>       at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
>       at org.scalatest.Suite$class.run(Suite.scala:1424)
>       at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
>       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
>       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
>       at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
>       at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
>       at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDDSuite.org$scalatest$BeforeAndAfterAll$$super$run(WriteAheadLogBackedBlockRDDSuite.scala:31)
>       at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
>       at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
>       at org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDDSuite.run(WriteAheadLogBackedBlockRDDSuite.scala:31)
>       at org.scalatest.Suite$class.callExecuteOnSuite$1(Suite.scala:1492)
>       at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1528)
>       at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1526)
>       at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>       at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
>       at org.scalatest.Suite$class.runNestedSuites(Suite.scala:1526)
>       at org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:29)
>       at org.scalatest.Suite$class.run(Suite.scala:1421)
>       at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:29)
>       at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
>       at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2563)
>       at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2557)
>       at scala.collection.immutable.List.foreach(List.scala:318)
>       at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2557)
>       at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
>       at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1043)
>       at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2722)
>       at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
>       at org.scalatest.tools.Runner$.main(Runner.scala:860)
>       at org.scalatest.tools.Runner.main(Runner.scala)
> {code}
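> The exception is thrown by the file-existence check in {{HdfsUtils.getOutputStream}} (HdfsUtils.scala:33 in the trace above): the writer refuses to open an output stream when the target file already exists on a filesystem without append support. A minimal sketch of that check, reconstructed from the stack trace rather than copied from the source, so details such as the config key may differ:
> {code}
> import org.apache.hadoop.conf.Configuration
> import org.apache.hadoop.fs.{FSDataOutputStream, Path}
>
> object HdfsUtilsSketch {
>   // Hypothetical reconstruction of the check in HdfsUtils.getOutputStream.
>   def getOutputStream(path: String, conf: Configuration): FSDataOutputStream = {
>     val dfsPath = new Path(path)
>     val dfs = dfsPath.getFileSystem(conf)
>     if (dfs.isFile(dfsPath)) {
>       // Appending is only safe if the underlying filesystem supports it.
>       if (conf.getBoolean("hdfs.append.support", false)) {
>         dfs.append(dfsPath)
>       } else {
>         // The branch the flaky tests hit: the target log file already
>         // exists, so the generated segment file names must have collided.
>         throw new IllegalStateException(
>           "File exists and there is no append support!")
>       }
>     } else {
>       dfs.create(dfsPath)
>     }
>   }
> }
> {code}
> Since each test is expected to write fresh log segments, reaching this branch suggests the generated segment file names can collide across tests, which would explain the flakiness.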


