Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/03/29 20:51:25 UTC

[jira] [Commented] (SPARK-14247) Spark does not compile with CDH 5.4.x due to a possible Ivy bug

    [ https://issues.apache.org/jira/browse/SPARK-14247?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15216628#comment-15216628 ] 

Sean Owen commented on SPARK-14247:
-----------------------------------

This is an Ivy / Maven problem, no?
You're also pointing at the fantastically archaic "MR1" build, though I'm not sure whether that matters; it is not the normal CDH artifact.
If it were something to do with the CDH versions, this also would not be a Spark issue.
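
For what it's worth, one way to check what Ivy resolved and evicted during an sbt build is the `evicted` task. A minimal sketch, assuming sbt 0.13.6 or later (where that task exists) and that Spark's sbt wrapper passes the same profile flags as the failing build:

    sbt -Phadoop-2.6 -Dhadoop.version=2.6.0-mr1-cdh5.4.5 evicted

That should show whether org.slf4j#slf4j-api;1.6.1 is being evicted in favor of 1.7.x before the assembly step fails.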

> Spark does not compile with CDH 5.4.x due to a possible Ivy bug
> ---------------------------------------------------------------
>
>                 Key: SPARK-14247
>                 URL: https://issues.apache.org/jira/browse/SPARK-14247
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.6.0
>            Reporter: Nan Zhu
>
> I recently tried to compile Spark against CDH 5.4.x with the following command:
> sbt -Phadoop-2.6 -Dhadoop.version=2.6.0-mr1-cdh5.4.5 -DskipTests assembly
> The build fails with this error: [error] impossible to get artifacts when data has not been loaded. IvyNode = org.slf4j#slf4j-api;1.6.1
> It seems that CDH depends on slf4j 1.6.x, while Spark upgraded to 1.7.x long ago; during compilation, slf4j 1.6.x is unexpectedly evicted (see the related discussion at https://github.com/sbt/sbt/issues/1598).
> I currently work around this by downgrading to slf4j 1.6.1 (an alternative version-pinning sketch follows below).
> What surprises me is that I was upgrading from Spark 1.5.0 to 1.6.x, and with 1.5.0 I could compile Spark successfully against the same CDH version.
>  
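
Regarding the workaround above: rather than downgrading, one commonly suggested fix for this class of Ivy eviction error in sbt (discussed in the linked sbt issue) is to pin slf4j to a single version. A minimal sketch for a project's build.sbt, assuming sbt 0.13, where dependencyOverrides is a Set[ModuleID]; the 1.7.10 version number is illustrative, not taken from Spark's build:

    // build.sbt -- force a single slf4j-api version so Ivy does not
    // evict a node that other modules still reference, which is what
    // triggers "impossible to get artifacts when data has not been loaded"
    dependencyOverrides += "org.slf4j" % "slf4j-api" % "1.7.10"

Whether this applies to Spark's own sbt build (which derives dependency versions from the Maven POMs) is a separate question, so treat it as a sketch for downstream projects hitting the same eviction.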


