Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:34:50 UTC

[jira] [Resolved] (SPARK-7043) KryoSerializer cannot be used with REPL to interpret code in which case class definition and its shipping are in the same line

     [ https://issues.apache.org/jira/browse/SPARK-7043?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-7043.
---------------------------------
    Resolution: Incomplete

> KryoSerializer cannot be used with REPL to interpret code in which case class definition and its shipping are in the same line
> ------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-7043
>                 URL: https://issues.apache.org/jira/browse/SPARK-7043
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.3.1
>         Environment: Ubuntu 14.04, no hadoop
>            Reporter: Peng Cheng
>            Priority: Minor
>              Labels: bulk-closed, classloader, kryo
>   Original Estimate: 48h
>  Remaining Estimate: 48h
>
> When spark-shell is deployed with the
> "spark.serializer=org.apache.spark.serializer.KryoSerializer" option, it cannot execute the following code (entered as one line):
>     case class Foo(i: Int);val ret = sc.parallelize((1 to 100).map(Foo), 10).collect()
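> For reference, one way to launch the shell with this setting (an assumed invocation consistent with the description above, not taken from the original report):
>     spark-shell --conf spark.serializer=org.apache.spark.serializer.KryoSerializer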
> This problem does not occur with JavaSerializer, nor when the same code is split into two lines. A likely explanation is that KryoSerializer uses a ClassLoader that is not registered as a subsidiary of the REPL's ClassLoader.
> A "dirty" fix would be to simply break the input at semicolons, but it is better to fix the ClassLoader to avoid other liabilities. A sketch of such a fix follows.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org