Posted to issues@spark.apache.org by "Ryan Williams (JIRA)" <ji...@apache.org> on 2017/07/29 00:08:00 UTC

[jira] [Created] (SPARK-21569) Internal Spark class needs to be kryo-registered

Ryan Williams created SPARK-21569:
-------------------------------------

             Summary: Internal Spark class needs to be kryo-registered
                 Key: SPARK-21569
                 URL: https://issues.apache.org/jira/browse/SPARK-21569
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.2.0
            Reporter: Ryan Williams


[Full repro here|https://github.com/ryan-williams/spark-bugs/tree/hf]

As of 2.2.0, {{saveAsNewAPIHadoopFile}} jobs fail (when {{spark.kryo.registrationRequired=true}}) with:

{code}
java.lang.IllegalArgumentException: Class is not registered: org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage
Note: To register this class use: kryo.register(org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage.class);
	at com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:458)
	at com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:79)
	at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:488)
	at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:593)
	at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:383)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
{code}
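For reference, a minimal sketch of the kind of job that triggers this (along the lines of the linked repro, though the exact contents there may differ; the master, output path, and record types below are illustrative):

{code}
import org.apache.hadoop.io.{NullWritable, Text}
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("kryo-repro")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrationRequired", "true")
val sc = new SparkContext(conf)

// The records themselves are written via the Hadoop OutputFormat, so they are
// not the problem; per the stack trace above, the failure occurs when the
// executor Kryo-serializes the task result, which is a
// FileCommitProtocol$TaskCommitMessage.
sc.parallelize(1 to 10)
  .map(i => (NullWritable.get(), new Text(i.toString)))
  .saveAsNewAPIHadoopFile[TextOutputFormat[NullWritable, Text]]("/tmp/kryo-repro-out")
{code}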

Spark should register this internal class with Kryo by default.

This was not a problem in 2.1.1.
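In the meantime, a workaround (a sketch of the standard approach for any unregistered class, not specific to this repro) is to register the class by name when building the conf; since the class is internal to Spark, it has to be looked up reflectively:

{code}
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrationRequired", "true")
  // The class is internal to Spark, so look it up by its binary name
  // (taken from the exception message above).
  .registerKryoClasses(Array(
    Class.forName("org.apache.spark.internal.io.FileCommitProtocol$TaskCommitMessage")
  ))
{code}

Equivalently, the fully-qualified class name can be passed via {{spark.kryo.classesToRegister}}.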


