Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2023/03/10 17:21:00 UTC

[jira] [Resolved] (SPARK-42036) Kryo ClassCastException getting task result when JDK versions mismatch

     [ https://issues.apache.org/jira/browse/SPARK-42036?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean R. Owen resolved SPARK-42036.
----------------------------------
    Resolution: Not A Problem

Mismatching Java versions between the driver and executors would never be supported per se.
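
Since the resolution is operational rather than a Spark change, a minimal sketch of pinning one JDK across the driver and executors for a YARN cluster-mode deployment (the JAVA_HOME path is a placeholder, not taken from this issue):

{noformat}
spark-submit \
  --conf spark.yarn.appMasterEnv.JAVA_HOME=/usr/lib/jvm/java-17 \
  --conf spark.executorEnv.JAVA_HOME=/usr/lib/jvm/java-17 \
  ...
{noformat}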

> Kryo ClassCastException getting task result when JDK versions mismatch
> ----------------------------------------------------------------------
>
>                 Key: SPARK-42036
>                 URL: https://issues.apache.org/jira/browse/SPARK-42036
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.3.0
>            Reporter: John Zhuge
>            Priority: Major
>
> {noformat}
> 22/12/21 01:27:12 ERROR TaskResultGetter: Exception while getting task result
> com.esotericsoftware.kryo.KryoException: java.lang.ClassCastException: java.lang.Integer cannot be cast to java.nio.ByteBuffer
> Serialization trace:
> lowerBounds (org.apache.iceberg.GenericDataFile)
> taskFiles (org.apache.iceberg.spark.source.SparkWrite$TaskCommit)
> writerCommitMessage (org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTaskResult)
> 	at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:144)
> {noformat}
> In Iceberg 1.1, `BaseFile.lowerBounds` is defined as
> {code:java}
> Map<Integer, ByteBuffer> {code}
> Driver JDK version: 1.8.0_352 (Azul Systems, Inc.)
> Executor JDK version: openjdk version "17.0.5" 2022-10-18 LTS
> Kryo version: 4.0.2
>  
> The same Spark job works when both the driver and executors run the same JDK, whether 8 or 17.
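> The trace is consistent with Kryo's reflection-based field serializer reading the map out of alignment, so an Integer key is decoded where a ByteBuffer value was expected; the reflected layout of JDK-internal classes can differ between JDK 8 and JDK 17. As a hedged workaround sketch (an assumption, not a fix confirmed in this issue), the job can fall back to Spark's default Java serialization, which embeds class metadata in the stream:
> {code:java}
> import org.apache.spark.SparkConf;
>
> // Sketch: route serialization through Spark's default JavaSerializer
> // instead of Kryo, so task results are not decoded via Kryo's
> // reflection-based FieldSerializer. "iceberg-write" is a hypothetical
> // application name.
> SparkConf conf = new SparkConf()
>     .setAppName("iceberg-write")
>     .set("spark.serializer", "org.apache.spark.serializer.JavaSerializer");
> {code}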



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
