Posted to issues@spark.apache.org by "Lorenz Bühmann (Jira)" <ji...@apache.org> on 2020/07/12 11:27:00 UTC

[jira] [Created] (SPARK-32283) Multiple Kryo registrators can't be used anymore

Lorenz Bühmann created SPARK-32283:
--------------------------------------

             Summary: Multiple Kryo registrators can't be used anymore
                 Key: SPARK-32283
                 URL: https://issues.apache.org/jira/browse/SPARK-32283
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.0.0
            Reporter: Lorenz Bühmann


This is a regression in Spark 3.0; it works in Spark 2.

According to the docs, it should be possible to register multiple Kryo registrators as a comma-separated list via the Spark config option spark.kryo.registrator.
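For example, a configuration like the following should pick up both registrators (the class names here are placeholders, not real classes):

{code:scala}
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // comma-separated list of registrator class names
  .set("spark.kryo.registrator",
    "com.example.FirstRegistrator,com.example.SecondRegistrator")
{code}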

In Spark 3.0 the code to parse Kryo config options has been refactored into the Scala class [Kryo|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/internal/config/Kryo.scala]. The code to parse the registrators is in [Line 29-32|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/internal/config/Kryo.scala#L29-L32]:

{code:scala}
val KRYO_USER_REGISTRATORS = ConfigBuilder("spark.kryo.registrator")
    .version("0.5.0")
    .stringConf
    .createOptional
{code}
but it should be
{code:scala}
val KRYO_USER_REGISTRATORS = ConfigBuilder("spark.kryo.registrator")
    .version("0.5.0")
    .stringConf
    .toSequence
    .createOptional
{code}
to split the comma-separated list.

In Spark 2.x the splitting was instead done directly in [KryoSerializer Line 77-79|https://github.com/apache/spark/blob/branch-2.4/core/src/main/scala/org/apache/spark/serializer/KryoSerializer.scala#L77-L79]:

{code:scala}
private val userRegistrators = conf.get("spark.kryo.registrator", "")
    .split(',').map(_.trim)
    .filter(!_.isEmpty)
{code}
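The Spark 2.x splitting logic above can be sketched standalone (parseRegistrators is just an illustrative name, not a Spark API):

{code:scala}
// Sketch of the Spark 2.x parsing: split on commas, trim whitespace,
// and drop empty entries so trailing commas and blanks are tolerated.
def parseRegistrators(value: String): Seq[String] =
  value.split(',').map(_.trim).filter(_.nonEmpty).toSeq
{code}

For example, parseRegistrators("com.example.A, com.example.B,") yields Seq("com.example.A", "com.example.B"), while an unset/empty value yields an empty Seq.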

Hope this helps.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
