Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2015/07/09 03:24:05 UTC

[jira] [Commented] (SPARK-7419) Flaky test: o.a.s.streaming.CheckpointSuite

    [ https://issues.apache.org/jira/browse/SPARK-7419?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14619701#comment-14619701 ] 

Josh Rosen commented on SPARK-7419:
-----------------------------------

[~tdas], it looks like SPARK-1600 has started recurring too.

org.apache.spark.streaming.CheckpointSuite's "recovery with file input stream" test failed again recently; the recovered output set is missing element 28:

{code}
Set(10, 1, 6, 21, 45, 3, 36, 15) did not equal Set(10, 1, 6, 28, 21, 45, 3, 36, 15)
{code}

https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/2886/AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.3,label=centos/testReport/junit/org.apache.spark.streaming/CheckpointSuite/recovery_with_file_input_stream/

> Flaky test: o.a.s.streaming.CheckpointSuite
> -------------------------------------------
>
>                 Key: SPARK-7419
>                 URL: https://issues.apache.org/jira/browse/SPARK-7419
>             Project: Spark
>          Issue Type: Bug
>          Components: Tests
>            Reporter: Andrew Or
>            Assignee: Tathagata Das
>            Priority: Critical
>              Labels: flaky-test
>
> Failing with error messages like
> {code}
> 5 did not equal 7 Number of outputs do not match
> {code}
> Various tests in the suite seem to be failing with similar error messages:
> https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.3,label=centos/2228/
> https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.0,label=centos/2230/



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
