Posted to issues@spark.apache.org by "Samuel Souza (Jira)" <ji...@apache.org> on 2021/08/31 17:06:00 UTC

[jira] [Created] (SPARK-36627) Tasks with Java proxy objects fail to deserialize

Samuel Souza created SPARK-36627:
------------------------------------

             Summary: Tasks with Java proxy objects fail to deserialize
                 Key: SPARK-36627
                 URL: https://issues.apache.org/jira/browse/SPARK-36627
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.0.3
            Reporter: Samuel Souza


In JavaSerializer.JavaDeserializationStream we override resolveClass of ObjectInputStream to use the thread's contextClassLoader. However, we do not override resolveProxyClass, which is invoked when deserializing Java proxy objects. As a result, Spark resolves the proxy's interfaces with the wrong classloader, and the job fails with the following exception:

Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 4 times, most recent failure: Lost task 0.3 in stage 1.0 (TID 4, <host>, executor 1): java.lang.ClassNotFoundException: <class>
	at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
	at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
	at java.base/java.lang.Class.forName0(Native Method)
	at java.base/java.lang.Class.forName(Class.java:398)
	at java.base/java.io.ObjectInputStream.resolveProxyClass(ObjectInputStream.java:829)
	at java.base/java.io.ObjectInputStream.readProxyDesc(ObjectInputStream.java:1917)
	...
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
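A minimal, self-contained sketch of the override being described (this is not Spark's actual code; the class name LoaderAwareStream and the demo in main are hypothetical). It mirrors the existing resolveClass override and adds the analogous resolveProxyClass, resolving each proxy interface through the supplied loader before building the proxy class:

```java
import java.io.*;
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

// Hypothetical sketch: an ObjectInputStream that resolves both ordinary
// classes AND proxy interfaces through a caller-supplied class loader,
// analogous to what JavaDeserializationStream would need.
public class LoaderAwareStream extends ObjectInputStream {
    private final ClassLoader loader;

    public LoaderAwareStream(InputStream in, ClassLoader loader) throws IOException {
        super(in);
        this.loader = loader;
    }

    // What Spark already overrides: plain class resolution via the loader.
    @Override
    protected Class<?> resolveClass(ObjectStreamClass desc) throws ClassNotFoundException {
        return Class.forName(desc.getName(), false, loader);
    }

    // The missing override from the report: resolve each proxy interface
    // with the same loader, then construct the proxy class from them.
    @Override
    protected Class<?> resolveProxyClass(String[] interfaces) throws ClassNotFoundException {
        Class<?>[] resolved = new Class<?>[interfaces.length];
        for (int i = 0; i < interfaces.length; i++) {
            resolved[i] = Class.forName(interfaces[i], false, loader);
        }
        return Proxy.getProxyClass(loader, resolved);
    }

    // Demo: round-trip a dynamic proxy through Java serialization.
    public static void main(String[] args) throws Exception {
        InvocationHandler h = (InvocationHandler & Serializable)
            (proxy, method, a) -> method.getName();
        Object p = Proxy.newProxyInstance(
            LoaderAwareStream.class.getClassLoader(),
            new Class<?>[] { Runnable.class }, h);

        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(p);
        }
        try (LoaderAwareStream ois = new LoaderAwareStream(
                new ByteArrayInputStream(bos.toByteArray()),
                Thread.currentThread().getContextClassLoader())) {
            Object back = ois.readObject();
            System.out.println("isProxy=" + Proxy.isProxyClass(back.getClass()));
        }
    }
}
```

Without the resolveProxyClass override, the default implementation falls back to the stream's own defining loader (the app classloader in the trace above), which is exactly where the ClassNotFoundException originates.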




--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org