Posted to issues@spark.apache.org by "Nan Zhu (JIRA)" <ji...@apache.org> on 2016/03/29 21:40:25 UTC

[jira] [Comment Edited] (SPARK-14247) Spark does not compile with CDH-5.4.x due to the possible bug of ivy.....

    [ https://issues.apache.org/jira/browse/SPARK-14247?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15216719#comment-15216719 ] 

Nan Zhu edited comment on SPARK-14247 at 3/29/16 7:39 PM:
----------------------------------------------------------

thanks [~sowen], it seems that changing the hadoop.version name solves the problem
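For reference, the difference comes down to the hadoop.version value passed to sbt: the mr1-flavoured CDH version string fails with the Ivy eviction error, while the plain CDH version string builds. Both commands are taken from this thread and are shown here as an illustration rather than a supported build recipe:

    sbt -Phadoop-2.6 -Dhadoop.version=2.6.0-mr1-cdh5.4.5 -DskipTests assembly          (fails: impossible to get artifacts ... org.slf4j#slf4j-api;1.6.1)
    sbt -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.4.5 -DskipTests clean assembly        (completes)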




was (Author: codingcat):
thanks [~sowen], it seems that changing the hadoop.version name solves the problem

but...

the command "sbt -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.4.5 -DskipTests clean assembly"

gives me an assembly jar whose name ends with the hadoop2.2.0 suffix (testing on trunk)?

/Users/nanzhu/code/spark/assembly/target/scala-2.11/spark-assembly-2.0.0-SNAPSHOT-hadoop2.2.0.jar

It seems this might be an issue belonging to another JIRA?
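One way to check which Hadoop version the build actually resolved is to inspect the dependency tree with the same profile and property. This is only a suggestion, assuming Spark's bundled build/mvn wrapper and the standard Maven dependency:tree goal (it is not something reported in this thread):

    ./build/mvn -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.4.5 dependency:tree -Dincludes=org.apache.hadoop:hadoop-client

If hadoop-client resolves to 2.2.0 there as well, the -Dhadoop.version property is not being picked up, and the jar-suffix question would indeed belong in a separate JIRA.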


> Spark does not compile with CDH-5.4.x due to the possible bug of ivy.....
> -------------------------------------------------------------------------
>
>                 Key: SPARK-14247
>                 URL: https://issues.apache.org/jira/browse/SPARK-14247
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.6.0
>            Reporter: Nan Zhu
>            Priority: Minor
>
> I recently tried to compile Spark against CDH 5.4.x with the following command:
> sbt -Phadoop-2.6 -Dhadoop.version=2.6.0-mr1-cdh5.4.5 -DskipTests assembly
> The build fails with an error saying: [error] impossible to get artifacts when data has not been loaded. IvyNode = org.slf4j#slf4j-api;1.6.1
> It seems that CDH depends on slf4j 1.6.x, while Spark upgraded to 1.7.x a long time ago, and during compilation slf4j 1.6.x is evicted unexpectedly (see the related discussion in https://github.com/sbt/sbt/issues/1598).
> I currently work around this by downgrading to slf4j 1.6.1 (see the sbt sketch after this quoted description).
> What surprises me is that I was upgrading from Spark 1.5.0 to 1.6.x; with 1.5.0, I could successfully compile Spark against the same version of CDH.
>  
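For readers hitting the same eviction, the workaround described in the report (pinning slf4j to a single version) can be expressed in a plain sbt project roughly as follows. This is a hypothetical, standalone build.sbt sketch, not Spark's actual build definition, and the 1.6.1 version simply mirrors the downgrade mentioned above:

    // build.sbt (hypothetical sketch, sbt 0.13 syntax)
    // Pin slf4j-api to one version so sbt/Ivy does not evict it inconsistently
    // when transitive dependencies (e.g. CDH artifacts) pull in slf4j 1.6.x.
    libraryDependencies += "org.slf4j" % "slf4j-api" % "1.6.1"
    dependencyOverrides += "org.slf4j" % "slf4j-api" % "1.6.1"

Forcing the version this way only papers over the mismatch; the cleaner fix, as noted in the edited comment above, is to pass a hadoop.version that resolves consistently in the first place.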



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org