Posted to issues@spark.apache.org by "Lantao Jin (JIRA)" <ji...@apache.org> on 2019/04/01 14:27:00 UTC

[jira] [Updated] (SPARK-27216) Upgrade RoaringBitmap to 0.7.45 to fix Kryo unsafe ser/dser issue

     [ https://issues.apache.org/jira/browse/SPARK-27216?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Lantao Jin updated SPARK-27216:
-------------------------------
    Summary: Upgrade RoaringBitmap to 0.7.45 to fix Kryo unsafe ser/dser issue  (was: Upgrade RoaringBitmap to 0.7.45)

> Upgrade RoaringBitmap to 0.7.45 to fix Kryo unsafe ser/dser issue
> -----------------------------------------------------------------
>
>                 Key: SPARK-27216
>                 URL: https://issues.apache.org/jira/browse/SPARK-27216
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.3, 2.4.0, 3.0.0
>            Reporter: Lantao Jin
>            Priority: Major
>
> HighlyCompressedMapStatus uses RoaringBitmap to record the empty blocks. But RoaringBitmap 0.5.11 cannot be serialized and deserialized correctly by the unsafe KryoSerializer.
> The unit test below reproduces the issue:
> {code}
>   // Intended for org.apache.spark.serializer.KryoSerializerSuite, where `conf` is
>   // the suite's SparkConf; a plain `new SparkConf(false)` also reproduces the bug.
>   test("kryo serialization with RoaringBitmap") {
>     val bitmap = new RoaringBitmap
>     bitmap.add(1787)
>     // Round trip through the default (safe) Kryo streams succeeds.
>     val safeSer = new KryoSerializer(conf).newInstance()
>     val bitmap2: RoaringBitmap = safeSer.deserialize(safeSer.serialize(bitmap))
>     assert(bitmap2.equals(bitmap))
>     // Round trip through the unsafe Kryo streams corrupts the bitmap.
>     conf.set("spark.kryo.unsafe", "true")
>     val unsafeSer = new KryoSerializer(conf).newInstance()
>     val bitmap3: RoaringBitmap = unsafeSer.deserialize(unsafeSer.serialize(bitmap))
>     assert(bitmap3.equals(bitmap)) // fails with RoaringBitmap 0.5.11
>   }
> {code}
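> Until the upgrade lands, a minimal workaround sketch (assuming the job does not otherwise rely on unsafe Kryo I/O) is to leave spark.kryo.unsafe at its default of false:
> {code}
> // Workaround sketch: stay on Kryo's safe Input/Output streams, which
> // round-trip RoaringBitmap 0.5.11 correctly.
> import org.apache.spark.SparkConf
>
> val conf = new SparkConf()
>   .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
>   .set("spark.kryo.unsafe", "false") // the default; set explicitly for clarity
> {code}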
> Upgrading to the latest release, 0.7.45, fixes the issue.
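> For reference, a sketch of the corresponding dependency bump (assumed to live in core/pom.xml, where the RoaringBitmap dependency is declared):
> {code}
> <dependency>
>   <groupId>org.roaringbitmap</groupId>
>   <artifactId>RoaringBitmap</artifactId>
>   <version>0.7.45</version>  <!-- was 0.5.11 -->
> </dependency>
> {code}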


