Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2016/11/29 17:31:58 UTC

[jira] [Updated] (SPARK-18546) UnsafeShuffleWriter corrupts encrypted shuffle files when merging

     [ https://issues.apache.org/jira/browse/SPARK-18546?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin updated SPARK-18546:
-----------------------------------
    Target Version/s: 2.1.0

> UnsafeShuffleWriter corrupts encrypted shuffle files when merging
> -----------------------------------------------------------------
>
>                 Key: SPARK-18546
>                 URL: https://issues.apache.org/jira/browse/SPARK-18546
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.0
>            Reporter: Marcelo Vanzin
>            Assignee: Marcelo Vanzin
>            Priority: Critical
>
> The merging algorithm in {{UnsafeShuffleWriter}} does not take encryption into account: when it merges encrypted spill files, the resulting data cannot be read, because data encrypted with different initialization vectors (IVs) ends up interleaved within the same partition's data. This leads to exceptions when the files are read back during the shuffle:
> {noformat}
> com.esotericsoftware.kryo.KryoException: com.ning.compress.lzf.LZFException: Corrupt input data, block did not start with 2 byte signature ('ZV') followed by type byte, 2-byte length)
> 	at com.esotericsoftware.kryo.io.Input.fill(Input.java:142)
> 	at com.esotericsoftware.kryo.io.Input.require(Input.java:155)
> 	at com.esotericsoftware.kryo.io.Input.readInt(Input.java:337)
> 	at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:109)
> 	at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:610)
> 	at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:721)
> 	at org.apache.spark.serializer.KryoDeserializationStream.readObject(KryoSerializer.scala:228)
> 	at org.apache.spark.serializer.DeserializationStream.readKey(Serializer.scala:169)
> 	at org.apache.spark.util.collection.ExternalAppendOnlyMap$DiskMapIterator.readNextItem(ExternalAppendOnlyMap.scala:512)
> 	at org.apache.spark.util.collection.ExternalAppendOnlyMap$DiskMapIterator.hasNext(ExternalAppendOnlyMap.scala:533)
> ...
> {noformat}
> (This trace is from our internal branch, so line numbers may not match exactly.)
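
The failure mode can be reproduced outside Spark: with a streaming cipher such as AES/CTR, ciphertexts produced under different IVs cannot be decrypted as one continuous stream, which is exactly what a raw byte-level merge of encrypted spill files produces. A minimal, self-contained Java sketch of that effect (the class name and sample data are illustrative, not Spark code):

```java
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;

public class IvMergeDemo {

    // Encrypt one "spill file" under its own IV, as each spill would be on disk.
    static byte[] encrypt(SecretKeySpec key, byte[] iv, byte[] data) throws Exception {
        Cipher c = Cipher.getInstance("AES/CTR/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        return c.doFinal(data);
    }

    // Returns {firstBlockDecryptsCleanly, secondBlockIsGarbled}.
    public static boolean[] demo() throws Exception {
        SecureRandom rnd = new SecureRandom();
        byte[] keyBytes = new byte[16];
        rnd.nextBytes(keyBytes);
        SecretKeySpec key = new SecretKeySpec(keyBytes, "AES");

        // Two spill files, each encrypted with a different random IV.
        byte[] iv1 = new byte[16];
        byte[] iv2 = new byte[16];
        rnd.nextBytes(iv1);
        rnd.nextBytes(iv2);
        byte[] plainA = "partition-data-A".getBytes(StandardCharsets.UTF_8);
        byte[] plainB = "partition-data-B".getBytes(StandardCharsets.UTF_8);
        byte[] spill1 = encrypt(key, iv1, plainA);
        byte[] spill2 = encrypt(key, iv2, plainB);

        // Naive byte-level merge: concatenate the raw ciphertexts.
        byte[] merged = new byte[spill1.length + spill2.length];
        System.arraycopy(spill1, 0, merged, 0, spill1.length);
        System.arraycopy(spill2, 0, merged, spill1.length, spill2.length);

        // A reader sees one partition and decrypts it as a single stream
        // under a single IV -- the second half uses the wrong keystream.
        Cipher dec = Cipher.getInstance("AES/CTR/NoPadding");
        dec.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv1));
        byte[] plain = dec.doFinal(merged);

        boolean firstOk = Arrays.equals(
                Arrays.copyOfRange(plain, 0, spill1.length), plainA);
        boolean secondGarbled = !Arrays.equals(
                Arrays.copyOfRange(plain, spill1.length, plain.length), plainB);
        return new boolean[] { firstOk, secondGarbled };
    }

    public static void main(String[] args) throws Exception {
        boolean[] r = demo();
        System.out.println("first block decrypts cleanly: " + r[0]);
        System.out.println("second block is garbled:      " + r[1]);
    }
}
```

The garbled second half is what the Kryo/LZF reader above then chokes on: the decompression codec finds bytes that no longer start with its expected block signature. The fix has to either decrypt-then-re-encrypt while merging, or skip the raw-byte fast path when encryption is enabled.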



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org