Posted to issues@spark.apache.org by "James Grinter (JIRA)" <ji...@apache.org> on 2018/11/20 11:22:00 UTC

[jira] [Commented] (SPARK-23897) Guava version

    [ https://issues.apache.org/jira/browse/SPARK-23897?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16693076#comment-16693076 ] 

James Grinter commented on SPARK-23897:
---------------------------------------

We also just bumped into CVE-2018-10237: it has now started triggering the OWASP dependency checker in our Spark application builds because of the bundled Guava dependency.

But I'll note that the Guava code itself does not use `AtomicDoubleArray` (one of the problematic classes) internally, and it only instantiates a `CompoundOrdering` object via the `compound` method of its `Ordering` collection class.
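To illustrate, here's a minimal sketch (the class name `CompoundOrderingDemo` is mine, not from Spark or Guava) of that public path: on the Guava versions I've looked at, `Ordering.compound` is what wraps the comparators in the package-private `CompoundOrdering`:

```java
import com.google.common.collect.Ordering;

public class CompoundOrderingDemo {
    public static void main(String[] args) {
        // compound() chains a secondary comparator onto this ordering;
        // internally Guava wraps both in a (package-private) CompoundOrdering.
        Ordering<String> byLength = Ordering.<Integer>natural().onResultOf(String::length);
        Ordering<String> lengthThenAlpha = byLength.compound(Ordering.natural());

        System.out.println(lengthThenAlpha.getClass().getSimpleName()); // CompoundOrdering
        System.out.println(lengthThenAlpha.compare("bb", "a"));         // > 0: "bb" is longer
    }
}
```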

Spark does not use `AtomicDoubleArray`, and although it *does* use `Ordering`, it never invokes the `compound` method that would create a `CompoundOrdering` object.
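Given that analysis, one pragmatic option while waiting for a Guava bump is to suppress this specific finding in dependency-check. A sketch of a suppression entry (the file name and the notes text are ours; check which suppression schema version your plugin supports):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- dependency-check-suppressions.xml (hypothetical name); point the
     plugin's suppressionFile setting at it. -->
<suppressions xmlns="https://jeremylong.github.io/DependencyCheck/dependency-suppression.1.3.xsd">
    <suppress>
        <notes>CVE-2018-10237 requires deserializing untrusted data into
        AtomicDoubleArray or CompoundOrdering; this application does neither.</notes>
        <packageUrl regex="true">^pkg:maven/com\.google\.guava/guava@.*$</packageUrl>
        <cve>CVE-2018-10237</cve>
    </suppress>
</suppressions>
```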

Someone else has asked about this specific CVE at https://issues.apache.org/jira/browse/SPARK-25762

> Guava version
> -------------
>
>                 Key: SPARK-23897
>                 URL: https://issues.apache.org/jira/browse/SPARK-23897
>             Project: Spark
>          Issue Type: Dependency upgrade
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Sercan Karaoglu
>            Priority: Minor
>
> Guava dependency version 14 is pretty old and needs to be updated to at least 16. The Google Cloud Storage connector uses a newer one, which triggers a pretty common Guava error, "java.lang.NoSuchMethodError: com.google.common.base.Splitter.splitToList(Ljava/lang/CharSequence;)Ljava/util/List;", and causes the app to crash.
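
For reference on the quoted error: `Splitter.splitToList(CharSequence)` only exists from Guava 15.0 onward, so code compiled against a newer Guava but resolved against Spark's Guava 14 at runtime fails at that call site. A minimal sketch (class name is mine):

```java
import com.google.common.base.Splitter;
import java.util.List;

public class SplitterDemo {
    public static void main(String[] args) {
        // Compiles fine against Guava 15+; if Guava 14 wins on the runtime
        // classpath, this call throws java.lang.NoSuchMethodError because
        // splitToList was only added in 15.0.
        List<String> parts = Splitter.on(',').splitToList("a,b,c");
        System.out.println(parts); // [a, b, c]
    }
}
```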


