Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/06/13 08:54:21 UTC
[jira] [Updated] (SPARK-15489) Dataset kryo encoder won't load custom user settings
[ https://issues.apache.org/jira/browse/SPARK-15489?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen updated SPARK-15489:
------------------------------
Assignee: Amit Sela
> Dataset kryo encoder won't load custom user settings
> -----------------------------------------------------
>
> Key: SPARK-15489
> URL: https://issues.apache.org/jira/browse/SPARK-15489
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.6.1
> Reporter: Amit Sela
> Assignee: Amit Sela
> Fix For: 2.0.0
>
>
> When setting a custom "spark.kryo.registrator" (or any other configuration, for that matter) through the API, the setting does not propagate to the encoder that uses a KryoSerializer, because the serializer is instantiated with "new SparkConf()" rather than the application's configuration.
> See: https://github.com/apache/spark/blob/07c36a2f07fcf5da6fb395f830ebbfc10eb27dcc/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala#L554
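> For illustration, a minimal sketch of the failing path (the app and registrator class names are hypothetical): the registrator is set through the public API, but the encoder's serializer never sees it because it constructs its own empty conf:
>
>     SparkConf conf = new SparkConf()
>         .setAppName("kryo-encoder-demo")                                  // hypothetical app name
>         .set("spark.kryo.registrator", "com.example.MyKryoRegistrator"); // hypothetical registrator
>     // A context built from this conf honors the registrator, but the
>     // Dataset kryo encoder internally calls `new SparkConf()` and so
>     // silently drops the setting.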
> This can be worked around by providing those configurations as system properties, but the configuration should really be passed to the encoder and applied to the SerializerInstance after creation.
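> A minimal sketch of that workaround (registrator class name again hypothetical): "new SparkConf()" loads "spark."-prefixed JVM system properties by default, so setting the property before the first kryo encoder is created makes it visible:
>
>     // Must run before any Dataset kryo encoder is instantiated, since
>     // `new SparkConf()` reads spark.* system properties at construction.
>     System.setProperty("spark.kryo.registrator", "com.example.MyKryoRegistrator");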
> Example:
> When using Encoders with kryo to encode generically typed objects in the following manner:
> // The unchecked cast erases the element type on purpose: everything
> // is serialized as a plain Object via kryo.
> @SuppressWarnings("unchecked")
> public static <T> Encoder<T> encoder() {
>     return Encoders.kryo((Class<T>) Object.class);
> }
> I get a decoding exception when trying to decode `java.util.Collections$UnmodifiableCollection`, which probably comes from Guava's `ImmutableList`.
> This happens when running with master = local[1]. The same code had no problems with the RDD API.
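> For completeness, a sketch of a registrator that could handle those wrappers, assuming the third-party kryo-serializers library (de.javakaffee:kryo-serializers) is on the classpath; the class name is hypothetical, and it only helps once the setting actually reaches the encoder's serializer (via a fix for this issue or the system-property workaround above):
>
>     import com.esotericsoftware.kryo.Kryo;
>     import de.javakaffee.kryoserializers.UnmodifiableCollectionsSerializer;
>     import org.apache.spark.serializer.KryoRegistrator;
>
>     public class MyKryoRegistrator implements KryoRegistrator {
>         @Override
>         public void registerClasses(Kryo kryo) {
>             // Registers serializers for the JDK's unmodifiable wrappers,
>             // including java.util.Collections$UnmodifiableCollection.
>             UnmodifiableCollectionsSerializer.registerSerializers(kryo);
>         }
>     }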