Posted to issues@spark.apache.org by "Jungtaek Lim (Jira)" <ji...@apache.org> on 2022/07/14 12:41:00 UTC

[jira] [Updated] (SPARK-39622) ParquetIOSuite fails intermittently on master branch

     [ https://issues.apache.org/jira/browse/SPARK-39622?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jungtaek Lim updated SPARK-39622:
---------------------------------
    Summary: ParquetIOSuite fails intermittently on master branch  (was: ParquetIOSuite fails consistently on master branch)

> ParquetIOSuite fails intermittently on master branch
> ----------------------------------------------------
>
>                 Key: SPARK-39622
>                 URL: https://issues.apache.org/jira/browse/SPARK-39622
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: Jungtaek Lim
>            Priority: Major
>
> "SPARK-7837 Do not close output writer twice when commitTask() fails" in ParquetIOSuite fails consistently with master branch. 
> Assertion error follows:
> {code}
> "Job aborted due to stage failure: Authorized committer (attemptNumber=0, stage=1, partition=0) failed; but task commit success, data duplication may happen." did not contain "Intentional exception for testing purposes"
> ScalaTestFailureLocation: org.apache.spark.sql.execution.datasources.parquet.ParquetIOSuite at (ParquetIOSuite.scala:1216)
> org.scalatest.exceptions.TestFailedException: "Job aborted due to stage failure: Authorized committer (attemptNumber=0, stage=1, partition=0) failed; but task commit success, data duplication may happen." did not contain "Intentional exception for testing purposes"
> 	at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
> 	at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
> 	at org.scalatest.Assertions$.newAssertionFailedException(Assertions.scala:1231)
> 	at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:1295)
> 	at org.apache.spark.sql.execution.datasources.parquet.ParquetIOSuite.$anonfun$new$259(ParquetIOSuite.scala:1216)
> 	at org.apache.spark.sql.execution.datasources.parquet.ParquetIOSuite.$anonfun$new$259$adapted(ParquetIOSuite.scala:1209)
> 	at org.apache.spark.sql.catalyst.plans.SQLHelper.withTempPath(SQLHelper.scala:69)
> 	at org.apache.spark.sql.catalyst.plans.SQLHelper.withTempPath$(SQLHelper.scala:66)
> 	at org.apache.spark.sql.QueryTest.withTempPath(QueryTest.scala:33)
> 	at org.apache.spark.sql.execution.datasources.parquet.ParquetIOSuite.$anonfun$new$256(ParquetIOSuite.scala:1209)
> 	at org.apache.spark.sql.catalyst.plans.SQLHelper.withSQLConf(SQLHelper.scala:54)
> 	at org.apache.spark.sql.catalyst.plans.SQLHelper.withSQLConf$(SQLHelper.scala:38)
> 	at org.apache.spark.sql.execution.datasources.parquet.ParquetIOSuite.org$apache$spark$sql$test$SQLTestUtilsBase$$super$withSQLConf(ParquetIOSuite.scala:56)
> 	at org.apache.spark.sql.test.SQLTestUtilsBase.withSQLConf(SQLTestUtils.scala:247)
> 	at org.apache.spark.sql.test.SQLTestUtilsBase.withSQLConf$(SQLTestUtils.scala:245)
> 	at org.apache.spark.sql.execution.datasources.parquet.ParquetIOSuite.withSQLConf(ParquetIOSuite.scala:56)
> 	at org.apache.spark.sql.execution.datasources.parquet.ParquetIOSuite.$anonfun$new$255(ParquetIOSuite.scala:1190)
> 	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
> 	at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
> 	at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
> 	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> 	at org.scalatest.Transformer.apply(Transformer.scala:22)
> 	at org.scalatest.Transformer.apply(Transformer.scala:20)
> 	at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:190)
> 	at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:203)
> 	at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:188)
> 	at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:200)
> 	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
> 	at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:200)
> 	at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:182)
> 	at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:64)
> 	at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
> 	at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
> 	at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:64)
> 	at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:233)
> 	at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
> 	at scala.collection.immutable.List.foreach(List.scala:431)
> 	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
> 	at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
> 	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
> 	at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:233)
> 	at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:232)
> 	at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
> 	at org.scalatest.Suite.run(Suite.scala:1112)
> 	at org.scalatest.Suite.run$(Suite.scala:1094)
> 	at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
> 	at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:237)
> 	at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
> 	at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:237)
> 	at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:236)
> 	at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:64)
> 	at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
> 	at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
> 	at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
> 	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:64)
> 	at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
> 	at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1320)
> 	at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1314)
> 	at scala.collection.immutable.List.foreach(List.scala:431)
> 	at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1314)
> 	at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
> 	at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
> 	at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1480)
> 	at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
> 	at org.scalatest.tools.Runner$.run(Runner.scala:798)
> 	at org.scalatest.tools.Runner.run(Runner.scala)
> 	at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2or3(ScalaTestRunner.java:38)
> 	at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:25)
> {code}
> The first query seems to produce the expected exception message, but the second query no longer seems to produce it.
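> For reference, below is a minimal sketch of the assertion shape the test relies on. The setup and names here are illustrative assumptions, not the actual ParquetIOSuite code (the real test installs a committer that throws from commitTask() via SQL conf before running the write):
> {code}
> // Minimal sketch (assumption: `spark` is a SparkSession already configured
> // with a fault-injecting OutputCommitter, as the real test arranges).
> import org.apache.spark.SparkException
> import org.scalatest.Assertions._
>
> val dir = java.nio.file.Files.createTempDirectory("parquet-39622").toFile
> val e = intercept[SparkException] {
>   spark.range(10).write.mode("overwrite").parquet(dir.getCanonicalPath)
> }
> // The suite expects the injected failure text to surface in the driver-side
> // exception message; in the failing runs the message is instead the
> // scheduler's "Authorized committer ... failed; but task commit success"
> // text, so this contains-check trips.
> assert(e.getMessage.contains("Intentional exception for testing purposes"))
> {code}
> As the stack trace above shows, it is the contains-check at ParquetIOSuite.scala:1216 that fails when the scheduler-level abort message is reported instead of the injected exception text.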



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org