Posted to reviews@spark.apache.org by "eejbyfeldt (via GitHub)" <gi...@apache.org> on 2023/07/11 15:07:52 UTC

[GitHub] [spark] eejbyfeldt opened a new pull request, #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

eejbyfeldt opened a new pull request, #41943:
URL: https://github.com/apache/spark/pull/41943

   
   ### What changes were proposed in this pull request?
   Drop the hardcoded `-target:jvm-1.8` scalac argument from `pom.xml`.
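   
   For illustration, a minimal sketch of the affected scalac `<args>` block in `pom.xml` after this change (a reconstruction based only on the args visible in the diff shown later in this thread; the surrounding plugin configuration is elided):
   ```
              <arg>-deprecation</arg>
              <arg>-feature</arg>
              <arg>-explaintypes</arg>
              <!-- <arg>-target:jvm-1.8</arg> removed: the scala-maven-plugin derives the
                   -target/-release flag from the java.version property instead -->
   ```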
   
   
   ### Why are the changes needed?
   Building with Maven is broken when using Scala 2.13 and Java 11 or later.
   
   It fails with
   ```
   $ ./build/mvn compile -Pscala-2.13 -Djava.version=11 -X
   ...
   [WARNING] [Warn] : [deprecation @  | origin= | version=] -target is deprecated: Use -release instead to compile against the correct platform API.
   [ERROR] [Error] : target platform version 8 is older than the release version 11
   [WARNING] one warning found
   [ERROR] one error found
   ...
   ```
   when setting the `java.version` property, or with
   ```
   $ ./build/mvn compile -Pscala-2.13
   ...
   [WARNING] [Warn] : [deprecation @  | origin= | version=] -target is deprecated: Use -release instead to compile against the correct platform API.
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:71: not found: value sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:26: not found: object sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: not found: object sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:206: not found: type DirectBuffer
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:210: not found: type Unsafe
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:212: not found: type Unsafe
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:213: not found: type DirectBuffer
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:216: not found: type DirectBuffer
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:236: not found: type DirectBuffer
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:26: Unused import
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: Unused import
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala:452: not found: value sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: not found: object sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:99: not found: type SignalHandler
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:99: not found: type Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:83: not found: type Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:108: not found: type SignalHandler
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:108: not found: value Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:114: not found: type Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:116: not found: value Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:128: not found: value Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: Unused import
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: Unused import
   [WARNING] one warning found
   [ERROR] 23 errors found
   ...
   ```
   
   This is caused by the fact that in `pom.xml` we hardcode that scalac should run with `-target:jvm-1.8` (regardless of the value of `java.version`). This was fine for Scala 2.12.18 and Scala 2.13.8, as the scala-maven-plugin would add a `-target` arg based on the `java.version` property (https://github.com/davidB/scala-maven-plugin/blob/4.8.0/src/main/java/scala_maven/ScalaMojoSupport.java#L629-L648); since that argument comes later, it took precedence over the value we hardcoded in Maven and everything worked as expected.
   
   The problem comes with Scala 2.13.11, where `-target` is deprecated and the scala-maven-plugin therefore uses the `-release` argument instead. The second failure, about not being able to access `sun._` packages, is the expected behavior when using `-release 8` (see https://github.com/scala/bug/issues/12643), but if one sets `-release 11` when using Java 11, the accesses to `sun._` compile just fine.
   
   
   Note: builds using Scala 2.13 and Java 11 or later without setting `java.version` to the appropriate value will still fail.
   
   Note 2: Java 8 builds still succeed, as `rt.jar` is passed on the `-bootclasspath` when using Java 8.
   
   
   ### Does this PR introduce _any_ user-facing change?
   Fixes the Maven build when using Scala 2.13 and Java 11 or later.
   
   
   ### How was this patch tested?
   Existing CI builds and manual builds locally.
   




[GitHub] [spark] LuciferYang commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1670065987

   Scenario 1: if the minimum supported Java version is 11, build with Java 11 and test with Java 17 and Java 21-ea.
   
   Modify `pom.xml`
   1. Change `java.version` to 11
   2. Change `-target:jvm-1.8` to `-target:11`.
   
   Build & Test
   
   1. run `build/mvn clean install -DskipTests -Pscala-2.13` with Java 11
   
   2. run `build/mvn test -Pscala-2.13` with Java 17
   
   failed:
   
   ```
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:71: object security is not a member of package sun
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: object nio is not a member of package sun
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:206: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:213: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:216: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:236: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: Unused import
   [WARNING] one warning found
   [ERROR] 7 errors found
   ```
   
   3. run `build/mvn test -Pscala-2.13` with Java 21
   
   failed:
   
   ```
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:71: object security is not a member of package sun
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: object nio is not a member of package sun
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:206: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:213: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:216: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:236: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: Unused import
   [WARNING] one warning found
   [ERROR] 7 errors found
   ```
   
   
   Scenario 2: if the minimum supported Java version is 17, build with Java 17 and test with Java 21-ea.
   
   Modify `pom.xml`
   1. Change `java.version` to 17
   2. Change `-target:jvm-1.8` to `-target:17`.
   
   Build & Test
   
   1. run `build/mvn clean install -DskipTests -Pscala-2.13` with Java 17
   
   2. run `build/mvn test -Pscala-2.13` with Java 21-ea
   
   failed:
   
   ```
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:71: object security is not a member of package sun
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: object nio is not a member of package sun
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:206: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:213: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:216: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:236: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: Unused import
   [WARNING] one warning found
   [ERROR] 7 errors found
   ```
   
   The test results are not as we expected; more investigation is needed to solve this problem.




[GitHub] [spark] eejbyfeldt commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "eejbyfeldt (via GitHub)" <gi...@apache.org>.
eejbyfeldt commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1658336924

   > I think the issue is you will target Java 17 bytecode if running on 17, when we want to target 8 in all cases
   
   If that is the case, then the changes currently in this PR are not what we want. But are we really sure that this is something that is expected or used? Because as far as I can tell, this is not something that actually worked in the past. If I take a Spark 3.4.1 build that I built from the v3.4.1 tag on Java 11 and then try to run `spark-submit run-example SparkPi` on Java 8, it fails with
   ```
   2023-07-31 14:59:15,304 INFO scheduler.DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:38) failed in 0.259 s due to Job aborted due to stage failure: Task serialization failed: java.lang.NoSuchMethodError: java.nio.ByteBuffer.flip()Ljava/nio/ByteBuffer;
   java.lang.NoSuchMethodError: java.nio.ByteBuffer.flip()Ljava/nio/ByteBuffer;
   	at org.apache.spark.util.io.ChunkedByteBufferOutputStream.toChunkedByteBuffer(ChunkedByteBufferOutputStream.scala:115)
   	at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:362)
   	at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:160)
   	at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:99)
   	at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:38)
   	at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:78)
   	at org.apache.spark.SparkContext.broadcastInternal(SparkContext.scala:1548)
   	at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1530)
   	at org.apache.spark.scheduler.DAGScheduler.submitMissingTasks(DAGScheduler.scala:1535)
   	at org.apache.spark.scheduler.DAGScheduler.submitStage(DAGScheduler.scala:1353)
   	at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:1295)
   	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2931)
   	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2923)
   	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2912)
   	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
   ```
   
   I think the problem here boils down to the fact that we have previously used the scalac arg `-target` to attempt to achieve what you describe. But according to the comment here (https://github.com/scala/bug/issues/12643#issuecomment-1253761646):
   
   > -target says "emit class file of version N, but I want to use arbitrary classes from the JDK and take my chances".
   
   so specifying only `-target` is not the proper way to build on a later Java version while targeting Java 8. My understanding is that if that is what we actually want, then we would need to specify the Java version using `-release` and actually fix the build errors that it causes.




[GitHub] [spark] LuciferYang commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1659721894

   @eejbyfeldt can we change to use `-release:8`?
   
   I have made the following changes based on your pr:
   1. upgrade `scala-maven-plugin` from 4.8.0 to 4.8.1
   2. change `-target:jvm-1.8` to `-release:8`, both line 2911 and line 3652
   
   then I test
   ```
   java -version
   openjdk version "17.0.8" 2023-07-18 LTS
   OpenJDK Runtime Environment Zulu17.44+15-CA (build 17.0.8+7-LTS)
   OpenJDK 64-Bit Server VM Zulu17.44+15-CA (build 17.0.8+7-LTS, mixed mode, sharing)
   
   ./build/mvn -DskipTests clean package
   ./build/mvn clean compile -Pscala-2.13
   ```
   
   Both Scala 2.12 and Scala 2.13 build successfully with Java 17, and `-release` is always `8`.




[GitHub] [spark] eejbyfeldt commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "eejbyfeldt (via GitHub)" <gi...@apache.org>.
eejbyfeldt commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1661617217

   > Assuming that a Java 8 build works on Java 17.
   
   The problem is that this does not work with the current code, as building with `-release 8` will forbid access to the classes in `sun.*`. That is basically the failing build I posted in my last comment.
   




[GitHub] [spark] srowen commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "srowen (via GitHub)" <gi...@apache.org>.
srowen commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1660380676

   Can we set java.version to 8? 
   I just don't see a direct use case in the current Spark build for targeting higher java versions.
   Assuming that a Java 8 build works on Java 17.




[GitHub] [spark] srowen commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "srowen (via GitHub)" <gi...@apache.org>.
srowen commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1666623057

   Yeah, I also confirmed this fails by testing the 3.5.0 RC1 build. If you build with Java 8, but test on Java 17, it won't work. We want this combination to work. If 2.13.8 still works as before, I believe we should use that. (And I think we should drop java 8 support in 4.0)




[GitHub] [spark] LuciferYang commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1666393843

   Due to this issue, should we downgrade the Scala 2.13 version to 2.13.8 in branch-3.5?  also cc @dongjoon-hyun 




[GitHub] [spark] LuciferYang commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1631786686

   It's my bad, I only tested Java 11 and 17 with SBT and missed the Maven scenario when upgrading to Scala 2.13.11.
   




[GitHub] [spark] pan3793 commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "pan3793 (via GitHub)" <gi...@apache.org>.
pan3793 commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1633471352

   The `java.version` can be resolved by
   
   ```
          <profile>
               <id>java-8</id>
               <activation>
                   <jdk>1.8</jdk>
               </activation>
               <properties>
                   <java.version>1.8</java.version>
               </properties>
           </profile>
   
           <profile>
               <id>java-11</id>
               <activation>
                   <jdk>11</jdk>
               </activation>
               <properties>
                   <java.version>11</java.version>
               </properties>
           </profile>
   ```
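   
   Presumably the same pattern would extend to newer JDKs; a hypothetical extra profile (not part of the suggestion above) covering the Java 17 case used in CI could look like:
   
   ```
           <profile>
               <id>java-17</id>
               <activation>
                   <jdk>17</jdk>
               </activation>
               <properties>
                   <!-- hypothetical: activates on JDK 17 so -Djava.version=17 need not be passed -->
                   <java.version>17</java.version>
               </properties>
           </profile>
   ```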




[GitHub] [spark] LuciferYang commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1657903238

   > The `java.version` can be resolved by
   > 
   > ```
   >        <profile>
   >             <id>java-8</id>
   >             <activation>
   >                 <jdk>1.8</jdk>
   >             </activation>
   >             <properties>
   >                 <java.version>1.8</java.version>
   >             </properties>
   >         </profile>
   > 
   >         <profile>
   >             <id>java-11</id>
   >             <activation>
   >                 <jdk>11</jdk>
   >             </activation>
   >             <properties>
   >                 <java.version>11</java.version>
   >             </properties>
   >         </profile>
   > ```
   
   Does this suggestion work? If it does, we wouldn't need to manually specify `-Djava.version=`?
   
   




[GitHub] [spark] pan3793 commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "pan3793 (via GitHub)" <gi...@apache.org>.
pan3793 commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1631184123

   A quick question: previously, the output artifacts were runnable on JDK 8 regardless of which JDK version was used to build. Is that still true after this change?
   
   cc @LuciferYang




[GitHub] [spark] LuciferYang commented on a diff in pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #41943:
URL: https://github.com/apache/spark/pull/41943#discussion_r1260547057


##########
pom.xml:
##########
@@ -2899,7 +2899,6 @@
               <arg>-deprecation</arg>
               <arg>-feature</arg>
               <arg>-explaintypes</arg>
-              <arg>-target:jvm-1.8</arg>

Review Comment:
   How about changing to `<arg>-target:jvm-${java.version}</arg>`?
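   
   For context, a sketch of how that suggestion would sit in the `<args>` block from the diff above (the other args are unchanged; the target value then follows the `java.version` property instead of being hardcoded):
   ```
              <arg>-deprecation</arg>
              <arg>-feature</arg>
              <arg>-explaintypes</arg>
              <arg>-target:jvm-${java.version}</arg>
   ```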





[GitHub] [spark] srowen commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "srowen (via GitHub)" <gi...@apache.org>.
srowen commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1658280182

   I think the issue is you will target Java 17 bytecode if running on 17, when we want to target 8 in all cases




[GitHub] [spark] eejbyfeldt commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "eejbyfeldt (via GitHub)" <gi...@apache.org>.
eejbyfeldt commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1631892085

   > @eejbyfeldt Do you know why SBT does not fail?
   
   The failures are not directly related to Scala 2.13.11. They are caused by the Maven plugin we are using for running scalac. For versions greater than or equal to `2.13.9` that plugin sets `-release` instead of `-target` (https://github.com/davidB/scala-maven-plugin/blob/4.8.0/src/main/java/scala_maven/ScalaMojoSupport.java#L629-L648). And as noted in https://github.com/scala/bug/issues/12643 and https://github.com/scala/bug/issues/12824, replacing `-target` with `-release` is **not** a noop and has a slightly different meaning and behavior. My understanding is that in the sbt build we only set `-target` and we do not use `-release`, and therefore we do not run into the same issues with the sbt build.
   
   
   > A quick question: previously, the output artifacts are runnable on JDK 8 whatever the building JDK version is. is it true after this change?
   
   Based on the example provided in https://github.com/scala/bug/issues/12824, using `-target:8` on Java 17 is unsafe and can cause runtime failures, so I am not sure we actually had that guarantee before. That said, as long as one does not override `java.version`, my understanding is that we get the same behavior as before and therefore the same compatibility.
   




[GitHub] [spark] eejbyfeldt commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "eejbyfeldt (via GitHub)" <gi...@apache.org>.
eejbyfeldt commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1632687298

   > I think the issue is that you end up compiling against a later version of JDK
   > If this change works for Java 8 and later, it seems fine.
   
   There are still issues compiling 2.13 with a newer Java such as 11 when not setting java.version to 11. When the java.version property is not set, the scala-maven-plugin will set `-release 8`, which does not allow access to `sun.*` classes. This is the case currently in master and it is not resolved in this PR.
   




[GitHub] [spark] eejbyfeldt commented on a diff in pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "eejbyfeldt (via GitHub)" <gi...@apache.org>.
eejbyfeldt commented on code in PR #41943:
URL: https://github.com/apache/spark/pull/41943#discussion_r1260619349


##########
pom.xml:
##########
@@ -2899,7 +2899,6 @@
               <arg>-deprecation</arg>
               <arg>-feature</arg>
               <arg>-explaintypes</arg>
-              <arg>-target:jvm-1.8</arg>

Review Comment:
   Yeah, that should work as well.





[GitHub] [spark] LuciferYang commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1631819222

   @eejbyfeldt Do you know why SBT does not fail?
   
   https://github.com/apache/spark/blob/6c02bd0198d250133683d09d023c9136eae1c154/project/SparkBuild.scala#L352-L369
   
   ```
   java -version
   openjdk version "11.0.18" 2023-01-17 LTS
   OpenJDK Runtime Environment Zulu11.62+17-CA (build 11.0.18+10-LTS)
   OpenJDK 64-Bit Server VM Zulu11.62+17-CA (build 11.0.18+10-LTS, mixed mode)
   ```
   
   `build/sbt compile -Pscala-2.13` succeeds, and it looks like `-target:jvm-${javaVersion.value}` resolves to `-target:jvm-1.8`
   
   




[GitHub] [spark] srowen commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "srowen (via GitHub)" <gi...@apache.org>.
srowen commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1632437620

   I think the issue is that you end up compiling against a later version of JDK libraries, which is not necessarily compatible, even if emitting bytecode for a lower Java version. However, yeah we've always accepted that and test it in CI/CD to make sure it works.
   If this change works for Java 8 and later, it seems fine.




[GitHub] [spark] eejbyfeldt commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "eejbyfeldt (via GitHub)" <gi...@apache.org>.
eejbyfeldt commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1720762469

   Closing this as it is no longer relevant. In 3.5, Scala 2.13 was downgraded, and the update will be done in https://github.com/apache/spark/pull/42918




[GitHub] [spark] LuciferYang commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1666680048

   I submitted a PR to test the `branch-3.5` branch with Scala 2.13.8; I will update the PR description later.
   
   https://github.com/apache/spark/pull/42362




[GitHub] [spark] srowen commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "srowen (via GitHub)" <gi...@apache.org>.
srowen commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1662127454

   OK, got it. The problem is, I don't think it helps to target Java 11 here, as it will just make the release unusable on Java 8, right?
   
   Is this workaround possible? https://github.com/scala/bug/issues/12643#issuecomment-1274143835
   
   Or else, just don't further upgrade Scala 2.13 until Java 8 support is dropped. That could reasonably happen in Spark 4




[GitHub] [spark] LuciferYang commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1632848362

   > I think we target Java 8 for release builds and then want that to run on Java 11+.
   
   I think this is ok now




[GitHub] [spark] LuciferYang commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1632866561

   I think the current issue is that we may have to specify the Java version through `-Djava.version` when using other Java versions for Maven build & test.
   
   I do the following experiments:
   
   1. hardcode the `java.version` in `pom.xml` to 11
   2. maven build using Java 17:
   
   ```
   build/mvn clean install -DskipTests -Pscala-2.13
   ```
   
   this will fail due to
   
   ```
   [WARNING] [Warn] : [deprecation @  | origin= | version=] -target is deprecated: Use -release instead to compile against the correct platform API.
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:71: object security is not a member of package sun
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: object nio is not a member of package sun
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:206: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:213: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:216: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:236: not found: type DirectBuffer
   [ERROR] [Error] /Users/yangjie01/SourceCode/git/spark-mine-13/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: Unused import
   ```
   
   We have to use the following command to build using Java 17 now (current usage in the GitHub Actions workflow):
   
   ```
   build/mvn clean install -DskipTests -Pscala-2.13 -Djava.version=17
   ```




[GitHub] [spark] srowen commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "srowen (via GitHub)" <gi...@apache.org>.
srowen commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1632902688

   I think the idea is you build with Java 8 and test with 11/17 - does that work? Or rather, that is certainly what we want: to have one release that works across all the Java versions.




[GitHub] [spark] LuciferYang commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1633468931

   > I think the idea is you build with Java 8 and test with 11/17 - does that work? Or rather, that is certainly what we want: to have one release that works across all the Java versions.
   
   I think this is ok.




[GitHub] [spark] eejbyfeldt commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "eejbyfeldt (via GitHub)" <gi...@apache.org>.
eejbyfeldt commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1659753792

   > change -target:jvm-1.8 to -release:8, both line 2911 and line 3652
   
   Hardcoding `-release:8` with the new default activation will not actually set the `-release` config to 8. This is because the scala-maven-plugin will also append a `-release` flag based on the property "java.version". Since the one appended by the scala-maven-plugin is later in the list of args, it takes precedence. So while, yes, doing as you suggest will compile, it will not have created a Java 8 release. The args can be seen by passing `-X` to Maven:
   
   ```
   $ ./build/mvn clean compile -Pscala-2.13 -X
   ...
   [DEBUG] [zinc] Running cached compiler 76b0ae1b for Scala compiler version 2.13.11
   [DEBUG] [zinc] The Scala compiler is invoked with:
           -unchecked
           -deprecation
           -feature
           -explaintypes
           -release:8
           -Wconf:cat=deprecation:wv,any:e
           -Wunused:imports
           -Wconf:cat=scaladoc:wv
           -Wconf:cat=lint-multiarg-infix:wv
           -Wconf:cat=other-nullary-override:wv
           -Wconf:cat=other-match-analysis&site=org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction.catalogFunction:wv
           -Wconf:cat=other-pure-statement&site=org.apache.spark.streaming.util.FileBasedWriteAheadLog.readAll.readFile:wv
           -Wconf:cat=other-pure-statement&site=org.apache.spark.scheduler.OutputCommitCoordinatorSuite.<local OutputCommitCoordinatorSuite>.futureAction:wv
           -Wconf:msg=^(?=.*?method|value|type|object|trait|inheritance)(?=.*?deprecated)(?=.*?since 2.13).+$:s
           -Wconf:msg=^(?=.*?Widening conversion from)(?=.*?is deprecated because it loses precision).+$:s
           -Wconf:msg=Auto-application to \`\(\)\` is deprecated:s
           -Wconf:msg=method with a single empty parameter list overrides method without any parameter list:s
           -Wconf:msg=method without a parameter list overrides a method with a single empty one:s
           -Wconf:cat=deprecation&msg=procedure syntax is deprecated:e
           -Wconf:cat=unchecked&msg=outer reference:s
           -Wconf:cat=unchecked&msg=eliminated by erasure:s
           -Wconf:msg=^(?=.*?a value of type)(?=.*?cannot also be).+$:s
           -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBase.scala:s
           -Wconf:cat=unused-imports&src=org\/apache\/spark\/graphx\/impl\/VertexPartitionBaseOps.scala:s
           -Wconf:msg=Implicit definition should have explicit type:s
           -release
           17
           -bootclasspath
           /home/eejbyfeldt/.m2/repository/org/scala-lang/scala-library/2.13.11/scala-library-2.13.11.jar
           -classpath
        /home/eejbyfeldt/.m2/repository/org/eclipse/jetty/jetty-io/9.4.50.v20221201/jetty-io-9.4.50.v20221201.jar:/home/eejbyfeldt/.m2/repository/org/slf4j/slf4j-api/2.0.7/slf4j-api-2.0.7.jar:/home/eejbyfeldt/.m2/repository/org/eclipse/jetty/jetty-client/9.4.51.v20230217/jetty-client-9.4.51.v20230217.jar:/home/eejbyfeldt/.m2/repository/org/eclipse/jetty/jetty-http/9.4.51.v20230217/jetty-http-9.4.51.v20230217.jar:/home/eejbyfeldt/.m2/repository/org/eclipse/jetty/jetty-util/9.4.51.v20230217/jetty-util-9.4.51.v20230217.jar:/home/eejbyfeldt/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/home/eejbyfeldt/dev/apache/spark/common/tags/target/scala-2.13/classes:/home/eejbyfeldt/.m2/repository/org/scala-lang/scala-reflect/2.13.11/scala-reflect-2.13.11.jar:/home/eejbyfeldt/.m2/repository/org/scala-lang/scala-compiler/2.13.11/scala-compiler-2.13.11.jar:/home/eejbyfeldt/.m2/repository/io/github/java-diff-utils/java-diff-utils/4.12/java-diff-utils-4.12.jar:/home/eejbyfeldt/.m2/repository/org/jline/jline/3.22.0/jline-3.22.0.jar:/home/eejbyfeldt/.m2/repository/net/java/dev/jna/jna/5.13.0/jna-5.13.0.jar:/home/eejbyfeldt/.m2/repository/org/scala-lang/scala-library/2.13.11/scala-library-2.13.11.jar
   ...
   ```
   
   Running with `-Djava.version=8` will set the release to 8 properly, and then compilation fails with:
   ```
   $ ./build/mvn clean compile -Pscala-2.13 -Djava.version=8
   ...
   [INFO] Compiler bridge file: /home/eejbyfeldt/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.13-1.8.0-bin_2.13.11__61.0-1.8.0_20221110T195421.jar
   [INFO] compiling 603 Scala sources and 77 Java sources to /home/eejbyfeldt/dev/apache/spark/core/target/scala-2.13/classes ...
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:71: not found: value sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:26: not found: object sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: not found: object sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:206: not found: type DirectBuffer
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:210: not found: type Unsafe
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:212: not found: type Unsafe
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:213: not found: type DirectBuffer
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:216: not found: type DirectBuffer
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:236: not found: type DirectBuffer
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:26: Unused import
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/storage/StorageUtils.scala:27: Unused import
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala:452: not found: value sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: not found: object sun
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:99: not found: type SignalHandler
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:99: not found: type Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:83: not found: type Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:108: not found: type SignalHandler
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:108: not found: value Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:114: not found: type Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:116: not found: value Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:128: not found: value Signal
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: Unused import
   [ERROR] [Error] /home/eejbyfeldt/dev/apache/spark/core/src/main/scala/org/apache/spark/util/SignalUtils.scala:26: Unused import
   [ERROR] 23 errors found
   ```
   
   and based on the discussion in https://github.com/scala/bug/issues/12643 I believe this is the expected behavior.
   




[GitHub] [spark] LuciferYang commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1659776259

   @eejbyfeldt Thank you for your response, I don’t have any further suggestions for now.




[GitHub] [spark] eejbyfeldt commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "eejbyfeldt (via GitHub)" <gi...@apache.org>.
eejbyfeldt commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1658031228

   > Does this suggestion work? If it does, we wouldn't need to manually specify -Djava.version=?
   
   Added it to this PR. Seems to work based on my testing locally. 
   
   I did not add it immediately as it seemed like a change to how it currently works, and it was not clear to me that everyone agreed it was a desired solution.




[GitHub] [spark] LuciferYang commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1631834692

   friendly ping @dongjoon-hyun WDYT?
   also cc @srowen 




[GitHub] [spark] srowen commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "srowen (via GitHub)" <gi...@apache.org>.
srowen commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1632762625

   I think we target Java 8 for release builds and then want that to run on Java 11+. Does that work, or did that already work?




[GitHub] [spark] eejbyfeldt closed pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "eejbyfeldt (via GitHub)" <gi...@apache.org>.
eejbyfeldt closed pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later
URL: https://github.com/apache/spark/pull/41943




[GitHub] [spark] eejbyfeldt commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "eejbyfeldt (via GitHub)" <gi...@apache.org>.
eejbyfeldt commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1657900414

   Any opinions on how we should proceed with this? It would be nice to have this fixed in the 3.5 branch, as not being able to build with Java 11 or newer is a regression compared to previous Spark releases.




[GitHub] [spark] srowen commented on pull request #41943: [SPARK-44376][BUILD] Fix maven build using scala 2.13 and Java 11 or later

Posted by "srowen (via GitHub)" <gi...@apache.org>.
srowen commented on PR #41943:
URL: https://github.com/apache/spark/pull/41943#issuecomment-1658582203

   Right, targeting Java 8 bytecode is necessary but not sufficient to run on Java 8. I think that's why we build releases on Java 8. These releases should still work on later Java releases, at least that's what the CI jobs are trying to test.

