Posted to user@spark.apache.org by Robert James <sr...@gmail.com> on 2014/07/07 15:00:59 UTC

spark-submit conflicts with dependencies

When I use spark-submit (along with spark-ec2), I get dependency
conflicts.  The spark-assembly jar includes older versions of Apache
Commons Codec and HttpClient, and these conflict with many of the libs
our software uses.

Is there any way to resolve these?  Or, if we use the precompiled
Spark, can we simply not use newer versions of these libs, or anything
that depends on them?
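A quick way to see which copy of a conflicting library actually wins at runtime is to ask the classloader where it resolved a class from. A minimal diagnostic sketch (the class names are only examples; substitute any class involved in your conflict):

```java
// WhichJar.java -- report where the classloader found a class, to see
// which copy of a duplicated dependency wins at runtime. The
// commons-codec class below is just an example.
public class WhichJar {
    static String locate(String className) {
        String resource = className.replace('.', '/') + ".class";
        java.net.URL url = WhichJar.class.getClassLoader().getResource(resource);
        return className + " -> " + (url == null ? "not on classpath" : url);
    }

    public static void main(String[] args) {
        System.out.println(locate("java.lang.String"));
        System.out.println(locate("org.apache.commons.codec.binary.Base64"));
    }
}
```

If the URL for a conflicting class points into spark-assembly rather than your own jar, Spark's older copy is the one your code is running against.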

Re: spark-submit conflicts with dependencies

Posted by Sean Owen <so...@cloudera.com>.
Normally, if this were all one app, Maven would have resolved the
conflict for you by choosing 1.8 over 1.6. You would not need to
exclude anything; Maven does it for you.

Here the problem is that 1.8 is in the app, while the server (Spark)
ships 1.6 on its own classpath. This is what the userClassPathFirst
setting is for: it lets the application's 1.8 take precedence over
Spark's copy.
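Concretely, that looks something like the following sketch. The exact property name varies by Spark version (early releases used the experimental spark.files.userClassPathFirst; the driver/executor variants appeared later), and the class and jar names here are hypothetical, so check the configuration docs for your release:

```
# Sketch: ask Spark to prefer the application's jars over its own.
# Property names vary by Spark version; verify against your release's
# configuration documentation. MyApp and my-app-assembly.jar are
# placeholders.
spark-submit \
  --class com.example.MyApp \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  my-app-assembly.jar
```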

On Wed, Jan 28, 2015 at 5:48 AM, Ted Yu <yu...@gmail.com> wrote:
> You can add exclusion to Spark pom.xml
> Here is an example (for hadoop-client which brings in hadoop-common)
>
> diff --git a/pom.xml b/pom.xml
> index 05cb379..53947d9 100644
> --- a/pom.xml
> +++ b/pom.xml
> @@ -632,6 +632,10 @@
>          <scope>${hadoop.deps.scope}</scope>
>          <exclusions>
>            <exclusion>
> +            <groupId>commons-configuration</groupId>
> +            <artifactId>commons-configuration</artifactId>
> +          </exclusion>
> +          <exclusion>
>              <groupId>asm</groupId>
>              <artifactId>asm</artifactId>
>            </exclusion>
>
> On Tue, Jan 27, 2015 at 9:33 PM, soid <so...@dicefield.com> wrote:
>>
>> I have the same problem too.
>> org.apache.hadoop:hadoop-common:jar:2.4.0 brings
>> commons-configuration:commons-configuration:jar:1.6 but we're using
>> commons-configuration:commons-configuration:jar:1.8
>> Is there any workaround for this?
>>
>> Greg
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/spark-submit-conflicts-with-dependencies-tp8909p21399.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>



Re: spark-submit conflicts with dependencies

Posted by Ted Yu <yu...@gmail.com>.
You can add exclusion to Spark pom.xml
Here is an example (for hadoop-client which brings in hadoop-common)

diff --git a/pom.xml b/pom.xml
index 05cb379..53947d9 100644
--- a/pom.xml
+++ b/pom.xml
@@ -632,6 +632,10 @@
         <scope>${hadoop.deps.scope}</scope>
         <exclusions>
           <exclusion>
+            <groupId>commons-configuration</groupId>
+            <artifactId>commons-configuration</artifactId>
+          </exclusion>
+          <exclusion>
             <groupId>asm</groupId>
             <artifactId>asm</artifactId>
           </exclusion>
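Before adding such an exclusion, it helps to confirm exactly which dependency drags in the old version. Maven's dependency tree can be filtered for this (a sketch, run from the application's project directory):

```
# Show every path through which commons-configuration enters the build;
# the output tells you which dependency needs the <exclusion>.
mvn dependency:tree -Dincludes=commons-configuration:commons-configuration
```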

On Tue, Jan 27, 2015 at 9:33 PM, soid <so...@dicefield.com> wrote:

> I have the same problem too.
> org.apache.hadoop:hadoop-common:jar:2.4.0 brings
> commons-configuration:commons-configuration:jar:1.6 but we're using
> commons-configuration:commons-configuration:jar:1.8
> Is there any workaround for this?
>
> Greg
>

Re: spark-submit conflicts with dependencies

Posted by soid <so...@dicefield.com>.
I have the same problem too. 
org.apache.hadoop:hadoop-common:jar:2.4.0 brings
commons-configuration:commons-configuration:jar:1.6 but we're using
commons-configuration:commons-configuration:jar:1.8
Is there any workaround for this?

Greg
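
One workaround that often resolves exactly this kind of clash is to shade and relocate the conflicting packages inside the application jar, so they can no longer collide with the copies Spark puts on the classpath. A hedged sketch using the maven-shade-plugin (the shaded package prefix is illustrative; pick one matching your project):

```
<!-- Sketch: relocate commons-configuration inside the app jar so the
     app's 1.8 cannot clash with the 1.6 that hadoop-common brings in.
     The myapp.shaded prefix is a placeholder. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>org.apache.commons.configuration</pattern>
            <shadedPattern>myapp.shaded.org.apache.commons.configuration</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

After shading, the application's bytecode references the relocated package, so whichever version Spark loads is irrelevant to the app.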


