Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/03/16 22:55:05 UTC
[jira] [Updated] (SPARK-23894) Flaky Test: BucketedWriteWithoutHiveSupportSuite
[ https://issues.apache.org/jira/browse/SPARK-23894?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-23894:
----------------------------------
Affects Version/s: (was: 3.0.0)
3.1.0
> Flaky Test: BucketedWriteWithoutHiveSupportSuite
> -------------------------------------------------
>
> Key: SPARK-23894
> URL: https://issues.apache.org/jira/browse/SPARK-23894
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.1.0
> Reporter: Imran Rashid
> Priority: Minor
> Attachments: unit-tests.log
>
>
> Flaky test observed here: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/88991/
> I'll attach a snippet of the unit-test logs for this suite and the preceding one. Here's a snippet of the exception.
> {noformat}
> 08:36:34.694 Executor task launch worker for task 436 ERROR Executor: Exception in task 0.0 in stage 402.0 (TID 436)
> java.lang.IllegalStateException: LiveListenerBus is stopped.
> at org.apache.spark.scheduler.LiveListenerBus.addToQueue(LiveListenerBus.scala:97)
> at org.apache.spark.scheduler.LiveListenerBus.addToStatusQueue(LiveListenerBus.scala:80)
> at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:93)
> at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:117)
> at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:117)
> at scala.Option.getOrElse(Option.scala:121)
> at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:117)
> at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:116)
> at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:286)
> at org.apache.spark.sql.test.TestSparkSession.sessionState$lzycompute(TestSQLContext.scala:42)
> at org.apache.spark.sql.test.TestSparkSession.sessionState(TestSQLContext.scala:41)
> at org.apache.spark.sql.SparkSession$$anonfun$1$$anonfun$apply$1.apply(SparkSession.scala:92)
> at org.apache.spark.sql.SparkSession$$anonfun$1$$anonfun$apply$1.apply(SparkSession.scala:92)
> at scala.Option.map(Option.scala:146)
> at org.apache.spark.sql.SparkSession$$anonfun$1.apply(SparkSession.scala:92)
> at org.apache.spark.sql.SparkSession$$anonfun$1.apply(SparkSession.scala:91)
> at org.apache.spark.sql.internal.SQLConf$.get(SQLConf.scala:110)
> at org.apache.spark.sql.types.DataType.sameType(DataType.scala:84)
> at org.apache.spark.sql.catalyst.analysis.TypeCoercion$$anonfun$1.apply(TypeCoercion.scala:105)
> at org.apache.spark.sql.catalyst.analysis.TypeCoercion$$anonfun$1.apply(TypeCoercion.scala:86)
> {noformat}
> I doubt this is actually caused by BucketedWriteWithoutHiveSupportSuite itself. I suspect it has more to do with {{SparkSession}}'s lazy evaluation of {{SharedState}} interacting badly with the way we set up the test Spark context, though I don't really understand it yet.
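> The suspected race above can be sketched with a minimal, hypothetical model (these are not Spark's actual classes): a bus that rejects registrations after {{stop()}}, plus a lazily initialized shared state that registers with the bus on first access, mirroring how {{SparkSession.sharedState}} is a lazy val that calls {{LiveListenerBus.addToStatusQueue}} when first touched. If the bus is stopped before that first access, initialization fails with the same {{IllegalStateException}} seen in the stack trace.

```java
// Hypothetical minimal model of the suspected race; class and method
// names are illustrative, not Spark's real API.
class ListenerBus {
    private volatile boolean stopped = false;

    void stop() { stopped = true; }

    // Like LiveListenerBus.addToQueue: refuses work once stopped.
    void addToQueue(String name) {
        if (stopped) throw new IllegalStateException("ListenerBus is stopped.");
    }
}

class Session {
    private final ListenerBus bus;
    private Object sharedState; // initialized lazily, like a Scala lazy val

    Session(ListenerBus bus) { this.bus = bus; }

    // First access triggers initialization, which touches the bus.
    synchronized Object sharedState() {
        if (sharedState == null) {
            bus.addToQueue("status"); // fails if the bus was stopped first
            sharedState = new Object();
        }
        return sharedState;
    }
}

public class LazyInitRace {
    public static void main(String[] args) {
        ListenerBus bus = new ListenerBus();
        Session session = new Session(bus);
        bus.stop(); // e.g. the previous suite tore down its context
        try {
            session.sharedState(); // first access happens after stop()
            System.out.println("initialized");
        } catch (IllegalStateException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

> Under this model, the failure only reproduces when the lazy field is first accessed after the teardown, which would explain why the test is flaky rather than deterministic.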
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org