Posted to issues@spark.apache.org by "Andrew Allen (JIRA)" <ji...@apache.org> on 2019/03/28 23:39:00 UTC

[jira] [Commented] (SPARK-27216) Kryo serialization with RoaringBitmap

    [ https://issues.apache.org/jira/browse/SPARK-27216?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16804447#comment-16804447 ] 

Andrew Allen commented on SPARK-27216:
--------------------------------------

Thanks for this bug report, [~cltlfcjin]. We were running into "FetchFailedException: Received a zero-size buffer for block shuffle_0_0_332 from BlockManagerId", and this bug report gave us enough of a hint to try {{config.set("spark.kryo.unsafe", "false")}}, which worked around the issue.
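
For anyone hitting the same FetchFailedException, here is a minimal sketch of that workaround as we applied it (the builder boilerplate around it is illustrative, not from the original report):

{code}
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Workaround: keep Kryo's unsafe I/O disabled so RoaringBitmap-backed
// HighlyCompressedMapStatus objects round-trip correctly during shuffle.
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.unsafe", "false") // false is the default; set it explicitly

val spark = SparkSession.builder().config(conf).getOrCreate()
{code}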

> Kryo serialization with RoaringBitmap
> -------------------------------------
>
>                 Key: SPARK-27216
>                 URL: https://issues.apache.org/jira/browse/SPARK-27216
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.3, 2.4.0, 3.0.0
>            Reporter: Lantao Jin
>            Priority: Major
>
> HighlyCompressedMapStatus uses a RoaringBitmap to record the empty blocks. However, a RoaringBitmap cannot be serialized and deserialized correctly with the unsafe KryoSerializer.
> The unit test below reproduces the issue:
> {code}
>   test("kryo serialization with RoaringBitmap") {
>     val conf = new SparkConf(false)
>     val bitmap = new RoaringBitmap
>     bitmap.add(1787)
>     // The default (safe) KryoSerializer round-trips the bitmap correctly
>     val safeSer = new KryoSerializer(conf).newInstance()
>     val bitmap2: RoaringBitmap = safeSer.deserialize(safeSer.serialize(bitmap))
>     assert(bitmap2.equals(bitmap))
>     // With unsafe I/O enabled, the round trip corrupts the bitmap
>     conf.set("spark.kryo.unsafe", "true")
>     val unsafeSer = new KryoSerializer(conf).newInstance()
>     val bitmap3: RoaringBitmap = unsafeSer.deserialize(unsafeSer.serialize(bitmap))
>     assert(bitmap3.equals(bitmap)) // this will fail
>   }
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org