Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/09/06 03:21:06 UTC

[GitHub] [spark] Yikun commented on pull request #37799: [SPARK-40331][DOCS] Recommend use Java 11/17 as the runtime environment of Spark

Yikun commented on PR #37799:
URL: https://github.com/apache/spark/pull/37799#issuecomment-1237615134

   Some info FYI:
   1. OpenJDK 8 has a [longer time](https://access.redhat.com/articles/1299013#OpenJDK_Life_Cycle) before EOL (2026) than OpenJDK 11 (2024), which reflects, to some degree, the upstream community's attitude toward Java 8.
   2. The initial aarch64 port (https://openjdk.org/jeps/237) was introduced in Java 9 and later backported to Java 8.
   3. Java 11 has more aarch64 performance enhancements; for example, improvements such as https://openjdk.org/jeps/315 are only available in Java 11, not in Java 8.
   4. IIRC, from discussions with the JDK experts on our internal team: OpenJDK 11 contains many experimental features that only became stable in OpenJDK 17, so Java 11 might be more of a transition version.
   5. The macOS/AArch64 port was introduced in Java 17: https://openjdk.org/jeps/391
   
   All in all, I personally think that unless we see Java 11 having clear advantages in most Apache Spark scenarios, users should choose the appropriate JDK among 8, 11, and 17 according to their own situation.
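   
   For anyone deciding among 8, 11, and 17, here is a minimal Scala sketch (assuming a local-mode SparkSession; the `JvmVersionCheck` object name is just for illustration) of how one might confirm which JVM the driver and executors actually run on:
   
   import org.apache.spark.sql.SparkSession
   
   object JvmVersionCheck {
     def main(args: Array[String]): Unit = {
       // Assumption: local mode, purely for illustration.
       val spark = SparkSession.builder()
         .appName("jvm-version-check")
         .master("local[*]")
         .getOrCreate()
   
       // JVM version the driver is running on.
       println(s"Driver JVM  : ${System.getProperty("java.version")}")
   
       // Distinct JVM versions seen on the executors (these can differ from
       // the driver, e.g. when JAVA_HOME varies across cluster nodes).
       val executorJvms = spark.sparkContext
         .parallelize(1 to spark.sparkContext.defaultParallelism)
         .map(_ => System.getProperty("java.version"))
         .distinct()
         .collect()
       println(s"Executor JVMs: ${executorJvms.mkString(", ")}")
   
       spark.stop()
     }
   }
   
   On a real cluster the executor-side check is the one that matters most, since driver and executors are not guaranteed to use the same JDK.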


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org