Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:25:16 UTC

[jira] [Updated] (SPARK-14956) Spark dependencies conflicts

     [ https://issues.apache.org/jira/browse/SPARK-14956?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-14956:
---------------------------------
    Labels: bulk-closed dependencies  (was: dependencies)

> Spark dependencies conflicts
> ----------------------------
>
>                 Key: SPARK-14956
>                 URL: https://issues.apache.org/jira/browse/SPARK-14956
>             Project: Spark
>          Issue Type: Brainstorming
>          Components: Spark Core
>    Affects Versions: 1.6.1
>            Reporter: Michel Lemay
>            Priority: Minor
>              Labels: bulk-closed, dependencies
>
> Because Spark ships with ages-old dependencies, we all too often have to downgrade one of our own dependencies just to keep it from blowing up inside Spark. Worse, these conflicts only surface at runtime, which makes them even harder to track down.
> So the usual solution, when we depend on a package directly (rather than only through one of our transitive dependencies), is to relocate it with the shade plugin, like this:
> {code}
> <plugin>
>   <groupId>org.apache.maven.plugins</groupId>
>   <artifactId>maven-shade-plugin</artifactId>
> ...
>   <configuration>
>     <relocations>
>       <relocation>
>         <pattern>scopt</pattern>
>         <shadedPattern>hidden.scopt</shadedPattern>
>       </relocation>
>     </relocations>
>   </configuration>
> </plugin>
> {code}
> Other times, we must exclude transitive dependencies, like this:
> {code}
>         <dependency>
>             <groupId>com.twitter.penguin</groupId>
>             <artifactId>korean-text</artifactId>
>             <version>4.1.2</version>
>             <exclusions>
>                 <exclusion>
>                     <groupId>org.slf4j</groupId>
>                     <artifactId>slf4j-nop</artifactId>
>                 </exclusion>
>             </exclusions>
>         </dependency>
> {code}
> Everything related to Guava, Log4j, Jackson Databind, scopt, and even Java 8 falls into this category.
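> For instance, a Guava conflict can be handled with the same relocation trick; the hidden.guava prefix below is just an arbitrary illustrative choice:
> {code}
>       <relocation>
>         <pattern>com.google.common</pattern>
>         <shadedPattern>hidden.guava</shadedPattern>
>       </relocation>
> {code}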
> I wonder if it would be possible to use OSGi-style plugins when running code inside Spark. That would shield us from all these issues.
> Could a bill-of-materials (BOM) be of any help here as well?
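> At the very least, a BOM would centralize the dependency versions a Spark application is supposed to build against. A minimal sketch of how it would be consumed (the spark-bom coordinates are hypothetical; Spark does not publish such an artifact today):
> {code}
>     <dependencyManagement>
>         <dependencies>
>             <dependency>
>                 <!-- hypothetical BOM declaring the versions Spark itself builds with -->
>                 <groupId>org.apache.spark</groupId>
>                 <artifactId>spark-bom</artifactId>
>                 <version>1.6.1</version>
>                 <type>pom</type>
>                 <scope>import</scope>
>             </dependency>
>         </dependencies>
>     </dependencyManagement>
> {code}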



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org