Posted to issues@spark.apache.org by "Bogdan Ghidireac (JIRA)" <ji...@apache.org> on 2014/05/19 08:43:38 UTC
[jira] [Commented] (SPARK-1877) ClassNotFoundException when loading RDD with serialized objects
[ https://issues.apache.org/jira/browse/SPARK-1877?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14001423#comment-14001423 ]
Bogdan Ghidireac commented on SPARK-1877:
-----------------------------------------
Submitted the patch as a GitHub pull request.
https://github.com/apache/spark/pull/821
> ClassNotFoundException when loading RDD with serialized objects
> ---------------------------------------------------------------
>
> Key: SPARK-1877
> URL: https://issues.apache.org/jira/browse/SPARK-1877
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.0.0
> Environment: standalone Spark cluster, jdk 1.7
> Reporter: Bogdan Ghidireac
>
> When I load an RDD that contains custom serialized objects, Spark throws a ClassNotFoundException. This happens only when Spark is deployed as a standalone cluster; it works fine when Spark runs locally.
> I debugged the issue and noticed that ObjectInputStream.resolveClass does not use the ExecutorURLClassLoader set by SparkSubmit. The classloader has to be set explicitly for the ObjectInputStream in SparkContext.objectFile when deserializing objects (a sketch follows after this quote):
> Utils.deserialize[Array[T]](...., Thread.currentThread.getContextClassLoader)
> I will attach a patch shortly...
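For illustration, here is a minimal sketch of the workaround described above, written against plain JDK serialization. The wrapper class, the deserialize helper, and MyClass are hypothetical names that only mirror what Spark's Utils.deserialize does; the actual signatures in Spark may differ:

    import java.io.{ByteArrayInputStream, InputStream, ObjectInputStream, ObjectStreamClass}

    // By default, ObjectInputStream resolves classes through the classloader of
    // the code on the call stack, which is why the ExecutorURLClassLoader set by
    // SparkSubmit is bypassed. Overriding resolveClass routes lookups through an
    // explicit loader instead.
    class ClassLoaderObjectInputStream(in: InputStream, loader: ClassLoader)
        extends ObjectInputStream(in) {
      override def resolveClass(desc: ObjectStreamClass): Class[_] =
        // Note: a production version would also handle primitive type descriptors.
        Class.forName(desc.getName, false, loader)
    }

    def deserialize[T](bytes: Array[Byte], loader: ClassLoader): T = {
      val ois = new ClassLoaderObjectInputStream(new ByteArrayInputStream(bytes), loader)
      try ois.readObject().asInstanceOf[T]
      finally ois.close()
    }

    // Usage, passing the context classloader mentioned in the description:
    //   val restored = deserialize[Array[MyClass]](bytes, Thread.currentThread.getContextClassLoader)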
--
This message was sent by Atlassian JIRA
(v6.2#6252)