Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/06/24 20:50:16 UTC

[jira] [Created] (SPARK-16193) Address flaky ExternalAppendOnlyMapSuite spilling tests

Sean Owen created SPARK-16193:
---------------------------------

             Summary: Address flaky ExternalAppendOnlyMapSuite spilling tests
                 Key: SPARK-16193
                 URL: https://issues.apache.org/jira/browse/SPARK-16193
             Project: Spark
          Issue Type: Bug
          Components: Spark Core, Tests
    Affects Versions: 2.0.0
            Reporter: Sean Owen
            Assignee: Sean Owen
            Priority: Minor


We've seen these tests fail for different codecs and operations, most recently in the 2.0.0 RC1 build:

{code}
- spilling with compression *** FAILED ***
  java.lang.Exception: Test failed with compression using codec org.apache.spark.io.LZ4CompressionCodec:

assertion failed: expected groupByKey to spill, but did not
  at scala.Predef$.assert(Predef.scala:170)
  at org.apache.spark.TestUtils$.assertSpilled(TestUtils.scala:170)
  at org.apache.spark.util.collection.ExternalAppendOnlyMapSuite.org$apache$spark$util$collection$ExternalAppendOnlyMapSuite$$testSimpleSpilling(ExternalAppendOnlyMapSuite.scala:253)
  at org.apache.spark.util.collection.ExternalAppendOnlyMapSuite$$anonfun$10$$anonfun$apply$mcV$sp$8.apply(ExternalAppendOnlyMapSuite.scala:218)
  at org.apache.spark.util.collection.ExternalAppendOnlyMapSuite$$anonfun$10$$anonfun$apply$mcV$sp$8.apply(ExternalAppendOnlyMapSuite.scala:216)
  at scala.collection.immutable.Stream.foreach(Stream.scala:594)
  ...
{code}

My theory is that the test's listener doesn't receive the stage-completed events for the spilled stages before the assertion runs. The test should wait until the job has finished before querying the number of spilled stages.
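A minimal sketch of that approach, assuming a SpillListener-style helper behind {{TestUtils.assertSpilled}} (the class and member names here are illustrative, not the exact Spark code): the listener records which stages spilled, counts down a latch on job end, and the accessor blocks on that latch so all stage events have been delivered before the count is read.

{code}
import java.util.concurrent.{CountDownLatch, TimeUnit}
import scala.collection.mutable
import org.apache.spark.executor.TaskMetrics
import org.apache.spark.scheduler._

// Illustrative sketch: track per-stage task metrics, mark stages that
// spilled, and block callers until the job has ended so that no
// stage-completed event can arrive after the test's assertion.
class SpillListener extends SparkListener {
  private val stageIdToTaskMetrics = new mutable.HashMap[Int, mutable.ArrayBuffer[TaskMetrics]]
  private val spilledStageIds = new mutable.HashSet[Int]
  private val jobDone = new CountDownLatch(1)

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = synchronized {
    stageIdToTaskMetrics.getOrElseUpdate(
      taskEnd.stageId, new mutable.ArrayBuffer[TaskMetrics]) += taskEnd.taskMetrics
  }

  override def onStageCompleted(stageComplete: SparkListenerStageCompleted): Unit = synchronized {
    val stageId = stageComplete.stageInfo.stageId
    val metrics = stageIdToTaskMetrics.remove(stageId).toSeq.flatten
    if (metrics.exists(_.memoryBytesSpilled > 0)) {
      spilledStageIds += stageId
    }
  }

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = {
    jobDone.countDown()
  }

  // Wait for job end before reporting, so the test can't race the listener.
  def numSpilledStages: Int = {
    assert(jobDone.await(10, TimeUnit.SECONDS), "timed out waiting for job end")
    synchronized { spilledStageIds.size }
  }
}
{code}

The test would register such a listener on the SparkContext before running the job, then call {{numSpilledStages}} afterwards; the latch makes the assertion deterministic instead of depending on event-delivery timing.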



