Posted to issues@spark.apache.org by "Louis Bergelson (JIRA)" <ji...@apache.org> on 2017/05/18 19:19:04 UTC
[jira] [Commented] (SPARK-20389) Upgrade kryo to fix NegativeArraySizeException
[ https://issues.apache.org/jira/browse/SPARK-20389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16016305#comment-16016305 ]
Louis Bergelson commented on SPARK-20389:
-----------------------------------------
What's the process for evaluating the effect on spark apps? As far as I can tell, the changes from 3->4 mean that data serialized with an older version of kryo will not be loadable by a newer version of kryo unless a compatibility option is configured. Hopefully most apps aren't storing data that's been serialized by kryo and are only using it for serialization between processes. I don't know whether this has knock-on effects for things like parquet, though; does it use kryo?
We have a recurring issue when serializing large objects that is fixed in kryo 4, and we would really like to see spark updated.
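For context, here is roughly how an app opts into kryo for serialization between processes in the first place; a minimal sketch, where MyRecord and MyKey are placeholder application classes:

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    // Use Kryo only for transient serialization (shuffle, serialized caching),
    // not as an on-disk storage format.
    val conf = new SparkConf()
      .setAppName("kryo-example")
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .registerKryoClasses(Array(classOf[MyRecord], classOf[MyKey]))

    val spark = SparkSession.builder().config(conf).getOrCreate()

An app configured like this only exercises kryo transiently, so the 3->4 format change hopefully shouldn't affect any persisted data for it.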
> Upgrade kryo to fix NegativeArraySizeException
> ----------------------------------------------
>
> Key: SPARK-20389
> URL: https://issues.apache.org/jira/browse/SPARK-20389
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, Spark Submit
> Affects Versions: 2.1.0
> Environment: Linux, Centos7, jdk8
> Reporter: Georg Heiler
>
> I am experiencing an issue with Kryo when writing parquet files. Similar to https://github.com/broadinstitute/gatk/issues/1524, a NegativeArraySizeException occurs. Apparently this is fixed in a more recent Kryo version, but Spark is still using the very old Kryo 3.3.
> Can you please upgrade to a fixed Kryo version?
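> In the meantime, a possible place to experiment with Kryo-level settings is a custom registrator wired up via spark.kryo.registrator; a rough sketch only (the class name is hypothetical, and disabling reference tracking is a possible mitigation, not a confirmed fix):
>
>     import com.esotericsoftware.kryo.Kryo
>     import org.apache.spark.serializer.KryoRegistrator
>
>     // Enabled with: spark.kryo.registrator=MyKryoRegistrator (hypothetical class name)
>     class MyKryoRegistrator extends KryoRegistrator {
>       override def registerClasses(kryo: Kryo): Unit = {
>         // Reference tracking is where large-object serialization has been reported
>         // to fail; turning it off here is illustrative, not a guaranteed workaround.
>         kryo.setReferences(false)
>         kryo.register(classOf[Array[Byte]])
>       }
>     }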
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org