Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2021/11/11 16:58:00 UTC

[jira] [Closed] (SPARK-35557) Adapt uses of JDK 17 Internal APIs

     [ https://issues.apache.org/jira/browse/SPARK-35557?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun closed SPARK-35557.
---------------------------------

> Adapt uses of JDK 17 Internal APIs
> ----------------------------------
>
>                 Key: SPARK-35557
>                 URL: https://issues.apache.org/jira/browse/SPARK-35557
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>    Affects Versions: 3.2.0
>            Reporter: Ismaël Mejía
>            Priority: Major
>
> I tried to run a Spark pipeline using the most recent 3.2.0-SNAPSHOT with Scala 2.12.4 on Java 17 and I found this exception:
> {code:java}
> java.lang.ExceptionInInitializerError
>  at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
>  at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
>  at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
>  at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
> ...
> Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module @110df513
>  at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:357)
>  at java.lang.reflect.AccessibleObject.checkCanSetAccessible (AccessibleObject.java:297)
>  at java.lang.reflect.Constructor.checkCanSetAccessible (Constructor.java:188)
>  at java.lang.reflect.Constructor.setAccessible (Constructor.java:181)
>  at org.apache.spark.unsafe.Platform.<clinit> (Platform.java:56)
>  at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit> (ByteArrayMethods.java:54)
>  at org.apache.spark.internal.config.package$.<clinit> (package.scala:1149)
>  at org.apache.spark.SparkConf$.<clinit> (SparkConf.scala:654)
>  at org.apache.spark.SparkConf.contains (SparkConf.scala:455)
> {code}
> It seems that Java 17 is stricter about uses of JDK internals; see JEP 403 (Strongly Encapsulate JDK Internals): [https://openjdk.java.net/jeps/403]
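The failure in the stack trace can be reproduced outside Spark with a small probe. This is a sketch of the reflective access that Spark's `Platform` class attempts at class-initialization time; the class name `DirectBufferProbe` and the `probe()` helper are illustrative, not part of Spark:

```java
import java.lang.reflect.Constructor;
import java.lang.reflect.InaccessibleObjectException;

public class DirectBufferProbe {

    // Try to make the private DirectByteBuffer(long, int) constructor
    // accessible, roughly what org.apache.spark.unsafe.Platform does.
    static String probe() {
        try {
            Constructor<?> ctor = Class.forName("java.nio.DirectByteBuffer")
                    .getDeclaredConstructor(long.class, int.class);
            // On JDK 17 this throws InaccessibleObjectException unless
            // java.base "opens" java.nio to the caller's module.
            ctor.setAccessible(true);
            return "accessible";
        } catch (InaccessibleObjectException e) {
            return "blocked";  // strong encapsulation (JEP 403) in effect
        } catch (ReflectiveOperationException e) {
            return "missing";  // constructor not found (signature differs on this JDK)
        }
    }

    public static void main(String[] args) {
        System.out.println(probe());
    }
}
```

On a stock JDK 17 this is expected to print "blocked", matching the `InaccessibleObjectException` above; starting the JVM with `--add-opens=java.base/java.nio=ALL-UNNAMED` restores the pre-JDK-17 behavior and is the usual workaround until the reflective access itself is adapted.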



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org