Posted to issues@spark.apache.org by "Ronald Chen (JIRA)" <ji...@apache.org> on 2015/06/09 18:41:01 UTC
[jira] [Comment Edited] (SPARK-4809) Improve Guava shading in Spark
[ https://issues.apache.org/jira/browse/SPARK-4809?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14579190#comment-14579190 ]
Ronald Chen edited comment on SPARK-4809 at 6/9/15 4:40 PM:
------------------------------------------------------------
OK, maybe I am pointing at the wrong issue. But what I see is that {{spark-network-common_2.11-1.2.0.jar}} contains no Guava classes, whereas {{spark-network-common_2.11-1.3.0.jar}} contains:
* com/google/common/base/Absent.class
* com/google/common/base/Function.class
* com/google/common/base/Optional$1$1.class
* com/google/common/base/Optional$1.class
* com/google/common/base/Optional.class
* com/google/common/base/Present.class
* com/google/common/base/Supplier.class
If my project also depends on a conflicting version of Guava, which copy of these classes wins on the classpath becomes non-deterministic.
> Improve Guava shading in Spark
> ------------------------------
>
> Key: SPARK-4809
> URL: https://issues.apache.org/jira/browse/SPARK-4809
> Project: Spark
> Issue Type: Improvement
> Affects Versions: 1.2.0
> Reporter: Marcelo Vanzin
> Assignee: Marcelo Vanzin
> Fix For: 1.3.0
>
>
> As part of SPARK-2848, we started shading Guava to help with projects that want to use Spark but use an incompatible version of Guava.
> The approach used there is a little sub-optimal, though. In particular, it makes it tricky to run unit tests in your project when those need to use spark-core APIs.
> We should make the shading more transparent so that it's easier to use spark-core, with or without an explicit Guava dependency.
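For context, this kind of shading is done with build-time package relocation. A minimal maven-shade-plugin sketch of the idea follows; the {{org.spark-project.guava}} target package matches what Spark 1.x used, but treat the fragment as illustrative rather than Spark's actual build configuration:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <!-- Rewrite bundled Guava bytecode and references from
           com.google.common.* to org.spark-project.guava.*, so the
           shaded copy cannot collide with a user-supplied Guava. -->
      <relocation>
        <pattern>com.google.common</pattern>
        <shadedPattern>org.spark-project.guava</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>
```

Classes left under their original {{com/google/common}} names, as in the jar listing above, defeat this isolation, which is why their presence matters.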
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)