Posted to issues@spark.apache.org by "Andrew Ash (JIRA)" <ji...@apache.org> on 2014/09/22 03:28:33 UTC
[jira] [Created] (SPARK-3630) Identify cause of Kryo+Snappy PARSING_ERROR
Andrew Ash created SPARK-3630:
---------------------------------
Summary: Identify cause of Kryo+Snappy PARSING_ERROR
Key: SPARK-3630
URL: https://issues.apache.org/jira/browse/SPARK-3630
Project: Spark
Issue Type: Task
Components: Spark Core
Affects Versions: 1.1.0
Reporter: Andrew Ash
Assignee: Ankur Dave
A recent GraphX commit caused non-deterministic exceptions in unit tests so it was reverted (see SPARK-3400).
Separately, [~aash] observed the same exception stack trace while using an application-specific Kryo registrator:
{noformat}
com.esotericsoftware.kryo.KryoException: java.io.IOException: failed to uncompress the chunk: PARSING_ERROR(2)
    com.esotericsoftware.kryo.io.Input.fill(Input.java:142)
    com.esotericsoftware.kryo.io.Input.require(Input.java:169)
    com.esotericsoftware.kryo.io.Input.readInt(Input.java:325)
    com.esotericsoftware.kryo.io.Input.readFloat(Input.java:624)
    com.esotericsoftware.kryo.serializers.DefaultSerializers$FloatSerializer.read(DefaultSerializers.java:127)
    com.esotericsoftware.kryo.serializers.DefaultSerializers$FloatSerializer.read(DefaultSerializers.java:117)
    com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:732)
    com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:109)
    com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
    com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:732)
{noformat}
This ticket is to identify the cause of the exception so that the reverted GraphX commit can be fixed and re-merged into master.
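One hedged diagnostic sketch: the stack trace above fails inside Snappy block decompression during Kryo deserialization, so rerunning the failing workload with a different block compression codec can help localize the fault. The {{spark.io.compression.codec}} setting below is a real Spark 1.1 configuration option; whether switching codecs actually avoids this particular error is an assumption, not a confirmed result.

{noformat}
# Hypothetical diagnostic run: swap the block compression codec from the
# default (snappy) to lzf before re-running the failing job or unit test.
# If PARSING_ERROR(2) disappears under lzf, the Snappy stream framing is
# implicated; if it persists, the corruption likely happens before
# decompression (e.g. in serialization or shuffle block handling).
spark.io.compression.codec  lzf
{noformat}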
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)