Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/09/05 06:45:31 UTC

[GitHub] [spark] LuciferYang commented on a diff in pull request #37799: [SPARK-40331][DOCS] Recommend use Java 11+ as the runtime environment of Spark

LuciferYang commented on code in PR #37799:
URL: https://github.com/apache/spark/pull/37799#discussion_r962537598


##########
docs/index.md:
##########
@@ -40,8 +40,8 @@ source, visit [Building Spark](building-spark.html).
 Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS), and it should run on any platform that runs a supported version of Java. This should include JVMs on x86_64 and ARM64. It's easy to run locally on one machine --- all you need is to have `java` installed on your system `PATH`, or the `JAVA_HOME` environment variable pointing to a Java installation.
 
 Spark runs on Java 8/11/17, Scala 2.12/2.13, Python 3.7+ and R 3.5+.
+Java 11+ is recommended as the runtime environment of Spark.

Review Comment:
   Good to me
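
For readers checking this locally: since the surrounding docs text ties running Spark to whichever `java` is on the system `PATH` or referenced by `JAVA_HOME`, a quick sanity check of the runtime this PR recommends (Java 11+) is to read the standard JVM system properties from `spark-shell` or any Scala REPL. The snippet below is an illustrative sketch only, not part of the PR's diff:

```scala
// Minimal sketch (not part of this PR): confirm which Java runtime the
// Spark driver is actually using, e.g. by pasting this into spark-shell.
// Only standard JVM system properties are used here, nothing Spark-specific.
val javaVersion = System.getProperty("java.version") // e.g. "1.8.0_292", "11.0.16", "17.0.4"
val javaHome    = System.getProperty("java.home")    // resolved installation directory
println(s"Running on Java $javaVersion from $javaHome")

// Rough major-version check against the recommendation in this docs change (Java 11+).
val parts = javaVersion.split("\\.")
val major = if (parts(0) == "1") parts(1).takeWhile(_.isDigit).toInt else parts(0).toInt
if (major < 11) {
  println(s"Java $major is still supported, but the docs now recommend Java 11 or later.")
}
```

Note that `java.version` reports a `1.x` string on Java 8 and earlier, hence the extra branch when parsing the major version.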



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

