Posted to issues@spark.apache.org by "Takeshi Yamamuro (JIRA)" <ji...@apache.org> on 2019/01/30 15:15:00 UTC
[jira] [Resolved] (SPARK-26732) Flaky test: SparkContextInfoSuite.getRDDStorageInfo only reports on RDDs that actually persist data
[ https://issues.apache.org/jira/browse/SPARK-26732?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Takeshi Yamamuro resolved SPARK-26732.
--------------------------------------
Resolution: Fixed
Assignee: Marcelo Vanzin
Fix Version/s: 3.0.0, 2.4.1, 2.3.3
Resolved by https://github.com/apache/spark/pull/23654
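For context, a minimal sketch (not the actual suite code; class and app names here are made up) of the behavior under test, quoted below: SparkContext.getRDDStorageInfo should list only RDDs whose blocks are actually cached, so a test along these lines persists one RDD, materializes it, and asserts on the length of the returned array.
{noformat}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

object RDDStorageInfoSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("storage-info-sketch"))
    try {
      // Persist an RDD and materialize it so its blocks are actually cached.
      val rdd = sc.parallelize(1 to 100).persist(StorageLevel.MEMORY_ONLY)
      rdd.count()
      // Expected: exactly one entry, for the persisted RDD above. The flaky
      // run observed 0 entries instead (hence "0 did not equal 1").
      println(sc.getRDDStorageInfo.length)
    } finally {
      sc.stop()
    }
  }
}
{noformat}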
> Flaky test: SparkContextInfoSuite.getRDDStorageInfo only reports on RDDs that actually persist data
> ---------------------------------------------------------------------------------------------------
>
> Key: SPARK-26732
> URL: https://issues.apache.org/jira/browse/SPARK-26732
> Project: Spark
> Issue Type: Bug
> Components: Tests
> Affects Versions: 2.3.2, 2.4.0, 3.0.0
> Reporter: Marcelo Vanzin
> Assignee: Marcelo Vanzin
> Priority: Major
> Fix For: 2.3.3, 2.4.1, 3.0.0
>
>
> From https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7/5437/testReport/junit/org.apache.spark/SparkContextInfoSuite/getRDDStorageInfo_only_reports_on_RDDs_that_actually_persist_data/:
> {noformat}
> Error Message
> org.scalatest.exceptions.TestFailedException: 0 did not equal 1
> Stacktrace
> sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: 0 did not equal 1
> at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:528)
> at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:527)
> at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
> at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:501)
> at org.apache.spark.SparkContextInfoSuite.$anonfun$new$3(SparkContextInfoSuite.scala:63)
> {noformat}
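The "0 did not equal 1" failure is consistent with a race: cached-block status reaches getRDDStorageInfo through asynchronous listener events, so an assertion issued immediately after the action can observe an empty array. A hypothetical deflaking pattern (illustrative only; the actual fix is in the PR linked above) retries the assertion in a ScalaTest context, assuming the `sc` from the sketch earlier in this message:
{noformat}
import org.scalatest.concurrent.Eventually._
import org.scalatest.time.SpanSugar._

// Retry the assertion until the asynchronous storage-status events have been
// processed, instead of asserting once immediately after the action.
eventually(timeout(10.seconds), interval(100.milliseconds)) {
  assert(sc.getRDDStorageInfo.length == 1)
}
{noformat}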