Posted to issues@spark.apache.org by "Stephen Coy (Jira)" <ji...@apache.org> on 2020/10/13 06:34:00 UTC

[jira] [Issue Comment Deleted] (SPARK-33090) Upgrade Google Guava

     [ https://issues.apache.org/jira/browse/SPARK-33090?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Stephen Coy updated SPARK-33090:
--------------------------------
    Comment: was deleted

(was: Created PR)

> Upgrade Google Guava
> --------------------
>
>                 Key: SPARK-33090
>                 URL: https://issues.apache.org/jira/browse/SPARK-33090
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 3.0.1
>            Reporter: Stephen Coy
>            Priority: Major
>
> Hadoop versions newer than 3.2.0 (such as 3.2.1 and 3.3.0) have started using features from newer versions of Google Guava.
> This leads to NoSuchMethodError failures (and similar errors) in Spark builds that specify newer versions of Hadoop. I believe this is due to the use of new methods in com.google.common.base.Preconditions.
> The above versions of Hadoop use guava-27.0-jre, whereas Spark is currently glued to guava-14.0.1.
> I have been running a Spark cluster with the version bumped to guava-29.0-jre without issue.
> Partly due to the way Spark is built, this change is a little more complicated than just changing the version, because newer versions of Guava have a new dependency on com.google.guava:failureaccess:1.0.

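A minimal Java sketch of the failure mode described above (hypothetical code, not taken from Hadoop or Spark): Guava 20 and later add specialized non-varargs overloads to com.google.common.base.Preconditions, so classes compiled against guava-27.0-jre can bind to an overload that guava-14.0.1 simply does not contain, and then fail at run time when the older jar is on the classpath.

import com.google.common.base.Preconditions;

// Hypothetical stand-in for the kind of call site newer Hadoop releases
// contain; illustrative only, not copied from Hadoop or Spark sources.
public class GuavaPreconditionsDemo {

    public static void main(String[] args) {
        String first = args.length > 0 ? args[0] : "(none)";

        // Compiled against guava-27.0-jre (or any Guava >= 20), javac binds
        // this call to the specialized overload
        //     checkArgument(boolean, String, Object, Object)
        // guava-14.0.1 only ships the varargs form
        //     checkArgument(boolean, String, Object...)
        // so running this class against the older jar throws
        // java.lang.NoSuchMethodError at this line.
        Preconditions.checkArgument(args.length >= 2,
                "expected at least 2 arguments, got %s (first: %s)",
                String.valueOf(args.length), first);
    }
}

With guava-29.0-jre on the classpath instead (together with its com.google.guava:failureaccess:1.0 dependency mentioned above), the same call site resolves normally.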


--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org