Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2022/10/12 17:46:00 UTC

[jira] [Updated] (SPARK-40685) Update kryo transitive dependency to 5.2.0 or later

     [ https://issues.apache.org/jira/browse/SPARK-40685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean R. Owen updated SPARK-40685:
---------------------------------
    Issue Type: Improvement  (was: Bug)
      Priority: Minor  (was: Major)

Does kryo 5 work with Spark? Are there updated 4.x versions that resolve this too?

> Update kryo transitive dependency to 5.2.0 or later
> ---------------------------------------------------
>
>                 Key: SPARK-40685
>                 URL: https://issues.apache.org/jira/browse/SPARK-40685
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.3.0
>            Reporter: Andrew Kyle Purtell
>            Priority: Minor
>
> Spark 3.3 currently ships with kryo-shaded-4.0.2.jar, which is subject to [kryo#829|https://github.com/EsotericSoftware/kryo/issues/829] and is flagged by several flavors of static vulnerability assessment tools as a medium-severity problem.
> Kryo versions 5.2.0 or later have a fix for this issue.
> {noformat}
> [INFO] org.apache.spark:spark-unsafe_2.12:jar:3.3.2-SNAPSHOT
> [INFO] +- com.twitter:chill_2.12:jar:0.10.0:compile
> [INFO] |  \- com.esotericsoftware:kryo-shaded:jar:4.0.2:compile
> {noformat}
> {noformat}
> [INFO] org.apache.spark:spark-core_2.12:jar:3.3.2-SNAPSHOT
> [INFO] +- com.twitter:chill_2.12:jar:0.10.0:compile
> [INFO] |  \- com.esotericsoftware:kryo-shaded:jar:4.0.2:compile
> {noformat}
> This issue is not meant to imply a security problem in Spark itself.
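> For reference, a downstream build that consumes Spark could in principle pin a newer kryo-shaded via Maven dependencyManagement. This is only a sketch: chill 0.10.0 is compiled against kryo-shaded 4.x, whether any 4.x release actually resolves kryo#829 is an open question, and runtime compatibility of any override with Spark/chill serialization would need to be verified.
> {noformat}
> <dependencyManagement>
>   <dependencies>
>     <!-- Hypothetical override; the version shown is illustrative, not a confirmed fix -->
>     <dependency>
>       <groupId>com.esotericsoftware</groupId>
>       <artifactId>kryo-shaded</artifactId>
>       <version>4.0.3</version>
>     </dependency>
>   </dependencies>
> </dependencyManagement>
> {noformat}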



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org