Posted to issues@spark.apache.org by "Jorge Machado (Jira)" <ji...@apache.org> on 2023/01/03 11:19:00 UTC
[jira] [Commented] (SPARK-33772) Build and Run Spark on Java 17
[ https://issues.apache.org/jira/browse/SPARK-33772?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17653941#comment-17653941 ]
Jorge Machado commented on SPARK-33772:
---------------------------------------
I still have an issue with this: running `sbt test` fails and I can't see why.
[error] Uncaught exception when running com.deutschebahn.zod.fvdl.commons.aws.athena.LocalViewCreatorTest: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
23/01/03 12:16:40 INFO Utils: Successfully started service 'sparkDriver' on port 49902.
[error] sbt.ForkMain$ForkError: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
[error] at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:114)
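The `Could not initialize class org.apache.spark.storage.StorageUtils$` failure is a typical JDK 17 module-access symptom: `StorageUtils` touches `sun.nio.ch.DirectBuffer`, which JDK 17 no longer opens to unnamed modules by default. When Spark launches its own JVM (e.g. via `spark-submit`), it passes the required `--add-opens` flags itself, but a JVM forked by `sbt test` does not inherit them. A possible workaround, sketched as a `build.sbt` fragment (the flag list is an assumption based on the options Spark's launcher adds; trim or extend it for your Spark version):

```scala
// build.sbt sketch: fork tests and pass the --add-opens flags Spark needs on JDK 17.
// The exact flag set is an assumption; compare against your Spark version's launcher options.
Test / fork := true
Test / javaOptions ++= Seq(
  "--add-opens=java.base/java.lang=ALL-UNNAMED",
  "--add-opens=java.base/java.lang.invoke=ALL-UNNAMED",
  "--add-opens=java.base/java.io=ALL-UNNAMED",
  "--add-opens=java.base/java.net=ALL-UNNAMED",
  "--add-opens=java.base/java.nio=ALL-UNNAMED",
  "--add-opens=java.base/java.util=ALL-UNNAMED",
  "--add-opens=java.base/java.util.concurrent=ALL-UNNAMED",
  "--add-opens=java.base/sun.nio.ch=ALL-UNNAMED" // the one StorageUtils$ needs
)
```

Forking is required because `javaOptions` only applies to a separate test JVM, not to the sbt process itself.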
> Build and Run Spark on Java 17
> ------------------------------
>
> Key: SPARK-33772
> URL: https://issues.apache.org/jira/browse/SPARK-33772
> Project: Spark
> Issue Type: New Feature
> Components: Build
> Affects Versions: 3.3.0
> Reporter: Dongjoon Hyun
> Assignee: Yang Jie
> Priority: Major
> Labels: releasenotes
> Fix For: 3.3.0
>
>
> Apache Spark supports Java 8 and Java 11 (LTS). The next Java LTS version is 17.
> ||Version||Release Date||
> |Java 17 (LTS)|September 2021|
> Apache Spark follows a release plan, and the `Spark 3.2 Code freeze` took place in July along with the release branch cut.
> - https://spark.apache.org/versioning-policy.html
> Supporting a new Java version is considered a new feature, which we cannot backport.
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org