Posted to reviews@spark.apache.org by "LuciferYang (via GitHub)" <gi...@apache.org> on 2023/09/20 05:42:38 UTC

[GitHub] [spark] LuciferYang commented on pull request #43005: [WIP][SPARK-44112][BUILD][INFRA] Drop Java 8 and 11 support

LuciferYang commented on PR #43005:
URL: https://github.com/apache/spark/pull/43005#issuecomment-1727006590

   @dongjoon-hyun I'd like to discuss something with you: https://github.com/LuciferYang/spark/actions/runs/6243680485/job/16949769483 tested Java 17 + Scala 2.12.18, and a lot of tests failed after changing `-target:jvm-1.8` to `-target:17`.
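   
   For context, the flag swap under test is roughly the following in sbt terms. This is only a sketch: the real change touches Spark's SparkBuild.scala and pom.xml, and the setting shown here is simplified.
   
   ```scala
   // build.sbt-style sketch of the option swap (simplified; Spark's real
   // build wires this up differently across sbt and Maven):
   Compile / scalacOptions += "-target:17" // previously "-target:jvm-1.8"
   ```
   
   With `-target:17`, scalac emits class files with major version 61 (Java 17) instead of 52 (Java 8), so the JVM applies checks to them that it skips for Java 8-level class files.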
   
   I can reproduce these errors locally:
   
   ```
   java -version
   openjdk version "17.0.8" 2023-07-18 LTS
   OpenJDK Runtime Environment Zulu17.44+15-CA (build 17.0.8+7-LTS)
   OpenJDK 64-Bit Server VM Zulu17.44+15-CA (build 17.0.8+7-LTS, mixed mode, sharing)
   ```
   
   ```
   build/sbt clean "catalyst/test" -Pscala-2.12
   ```
   
   ```
   [info] LimitPushdownSuite:
   [info] org.apache.spark.sql.catalyst.optimizer.LimitPushdownSuite *** ABORTED *** (0 milliseconds)
   [info]   java.lang.IllegalAccessError: Update to non-static final field org.apache.spark.SparkFunSuite.invokeBeforeAllAndAfterAllEvenIfNoTestsAreExpected attempted from a different method (org$scalatest$BeforeAndAfterAll$_setter_$invokeBeforeAllAndAfterAllEvenIfNoTestsAreExpected_$eq) than the initializer method <init>
   [info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$_setter_$invokeBeforeAllAndAfterAllEvenIfNoTestsAreExpected_$eq(SparkFunSuite.scala:69)
   [info]   at org.scalatest.BeforeAndAfterAll.$init$(BeforeAndAfterAll.scala:148)
   [info]   at org.apache.spark.SparkFunSuite.<init>(SparkFunSuite.scala:70)
   [info]   at org.apache.spark.sql.catalyst.optimizer.LimitPushdownSuite.<init>(LimitPushdownSuite.scala:29)
   [info]   at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
   [info]   at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
   [info]   at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
   [info]   at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
   [info]   at java.base/java.lang.reflect.ReflectAccess.newInstance(ReflectAccess.java:128)
   [info]   at java.base/jdk.internal.reflect.ReflectionFactory.newInstance(ReflectionFactory.java:347)
   [info]   at java.base/java.lang.Class.newInstance(Class.java:645)
   [info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:454)
   [info]   at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:414)
   [info]   at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
   [info]   at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
   [info]   at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
   [info]   at java.base/java.lang.Thread.run(Thread.java:833)
   [error] Uncaught exception when running org.apache.spark.sql.catalyst.optimizer.LimitPushdownSuite: java.lang.IllegalAccessError: Update to non-static final field org.apache.spark.SparkFunSuite.invokeBeforeAllAndAfterAllEvenIfNoTestsAreExpected attempted from a different method (org$scalatest$BeforeAndAfterAll$_setter_$invokeBeforeAllAndAfterAllEvenIfNoTestsAreExpected_$eq) than the initializer method <init> 
   [error] sbt.ForkMain$ForkError: java.lang.IllegalAccessError: Update to non-static final field org.apache.spark.SparkFunSuite.invokeBeforeAllAndAfterAllEvenIfNoTestsAreExpected attempted from a different method (org$scalatest$BeforeAndAfterAll$_setter_$invokeBeforeAllAndAfterAllEvenIfNoTestsAreExpected_$eq) than the initializer method <init> 
   [error] 	at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$_setter_$invokeBeforeAllAndAfterAllEvenIfNoTestsAreExpected_$eq(SparkFunSuite.scala:69)
   [error] 	at org.scalatest.BeforeAndAfterAll.$init$(BeforeAndAfterAll.scala:148)
   [error] 	at org.apache.spark.SparkFunSuite.<init>(SparkFunSuite.scala:70)
   [error] 	at org.apache.spark.sql.catalyst.optimizer.LimitPushdownSuite.<init>(LimitPushdownSuite.scala:29)
   [error] 	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
   [error] 	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
   [error] 	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
   [error] 	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
   [error] 	at java.base/java.lang.reflect.ReflectAccess.newInstance(ReflectAccess.java:128)
   [error] 	at java.base/jdk.internal.reflect.ReflectionFactory.newInstance(ReflectionFactory.java:347)
   [error] 	at java.base/java.lang.Class.newInstance(Class.java:645)
   [error] 	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:454)
   [error] 	at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:414)
   [error] 	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
   [error] 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
   [error] 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
   [error] 	at java.base/java.lang.Thread.run(Thread.java:833)
   ...
   
   [info] Run completed in 1 second, 569 milliseconds.
   [info] Total number of tests run: 0
   [info] Suites: completed 0, aborted 297
   [info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
   [info] *** 297 SUITES ABORTED ***
   [error] Error: Total 331, Failed 19, Errors 297, Passed 15
   [error] Failed tests:
   [error] 	org.apache.spark.sql.catalyst.expressions.RowBasedKeyValueBatchSuite
   [error] 	org.apache.spark.sql.connector.catalog.CatalogLoadingSuite
   [error] Error during tests:
   [error] 	org.apache.spark.sql.catalyst.optimizer.ReassignLambdaVariableIDSuite
   [error] 	org.apache.spark.sql.catalyst.plans.logical.AnalysisHelperSuite
   ...
   ```
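   
   My working theory for the `IllegalAccessError` (an assumption on my part, not verified in this PR): Scala 2.12 compiles a concrete trait `val` such as ScalaTest's `invokeBeforeAllAndAfterAllEvenIfNoTestsAreExpected` into a final field of the subclass that gets assigned through a synthetic setter (the `org$scalatest$BeforeAndAfterAll$_setter_$..._$eq` method in the stack trace) instead of directly in `<init>`. Newer class file versions forbid writes to final fields outside `<init>`, so this encoding only breaks once we raise `-target`. A minimal standalone sketch of the pattern:
   
   ```scala
   // Standalone sketch (illustrating my assumed root cause): a concrete
   // trait val becomes a final field of the implementing class, and the
   // trait's $init$ assigns it via a synthetic setter, outside <init>.
   trait Mixin {
     val initialized: Boolean = true // concrete trait val
   }
   
   object Repro {
     def main(args: Array[String]): Unit = {
       // Compiled with Scala 2.12 and -target:17, constructing the mixin
       // should throw the same IllegalAccessError as the suites above.
       new Mixin {}
     }
   }
   ```
   
   Scala 2.13 changed this encoding (as far as I know, the field assignment moved into the subclass constructor), which would explain why the 2.13 run below passes.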
   Indeed, when we change the Scala version to 2.13 and run the same tests with Java 17,
   
   ```
   dev/change-scala-version.sh 2.13
   build/sbt clean "catalyst/test" -Pscala-2.13
   ```
   
   ```
   [info] Run completed in 2 minutes, 16 seconds.
   [info] Total number of tests run: 7138
   [info] Suites: completed 297, aborted 0
   [info] Tests: succeeded 7138, failed 0, canceled 1, ignored 5, pending 0
   [info] All tests passed.
   ```
   all tests pass.
   
   If this is the case, can we drop Scala 2.12 support first and then upgrade the Java version? That order seems a bit easier.
   

