Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2014/11/12 09:06:33 UTC
[jira] [Commented] (SPARK-4359) Empty classifier in "avro-mapred" is misinterpreted in SBT
[ https://issues.apache.org/jira/browse/SPARK-4359?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14207794#comment-14207794 ]
Andrew Or commented on SPARK-4359:
----------------------------------
Ok, I reverted commit https://github.com/apache/spark/commit/78887f94a0ae9cdcfb851910ab9c7d51a1ef2acb for branch-1.1 for now.
> Empty classifier in "avro-mapred" is misinterpreted in SBT
> ----------------------------------------------------------
>
> Key: SPARK-4359
> URL: https://issues.apache.org/jira/browse/SPARK-4359
> Project: Spark
> Issue Type: Bug
> Components: Build
> Affects Versions: 1.1.0, 1.2.0
> Reporter: Andrew Or
> Assignee: Andrew Or
> Priority: Critical
>
> In the parent pom, "avro.mapred.classifier" is set to "hadoop2" for Yarn builds but is otherwise left unset. As a result, when an application that depends on "spark-hive_2.10" is built with SBT, SBT tries to resolve a jar whose name literally contains the unresolved placeholder:
> {code}
> [warn] ==== Maven Repository: tried
> [warn] http://repo1.maven.org/maven2/org/apache/avro/avro-mapred/1.7.6/avro-mapred-1.7.6-${avro.mapred.classifier}.jar
> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
> [warn] :: FAILED DOWNLOADS ::
> [warn] :: ^ see resolution messages for details ^ ::
> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
> [warn] :: org.apache.avro#avro-mapred;1.7.6!avro-mapred.jar
> [warn] ::::::::::::::::::::::::::::::::::::::::::::::
> sbt.ResolveException: download failed: org.apache.avro#avro-mapred;1.7.6!avro-mapred.jar
> {code}
> This is because SBT does not perform Maven property interpolation, so "${avro.mapred.classifier}" is passed through as a literal string instead of being substituted.
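For anyone hitting this before a fix lands, here is a minimal build.sbt sketch of one possible workaround: exclude avro-mapred from spark-hive and pin it directly with a concrete classifier. This is untested and assumes Scala 2.10, Spark 1.1.0, and a Hadoop 2 cluster (hence the "hadoop2" classifier the parent pom uses for Yarn builds).
{code}
// Hypothetical workaround sketch, not the official fix: bypass the
// unresolved ${avro.mapred.classifier} placeholder by excluding
// avro-mapred from spark-hive and declaring it with an explicit classifier.
libraryDependencies ++= Seq(
  // %% appends the Scala binary version, yielding spark-hive_2.10 here.
  ("org.apache.spark" %% "spark-hive" % "1.1.0")
    .exclude("org.apache.avro", "avro-mapred"),
  // Pin the artifact that failed to resolve; "hadoop2" is an assumption
  // matching the classifier the parent pom sets for Yarn builds.
  "org.apache.avro" % "avro-mapred" % "1.7.6" classifier "hadoop2"
)
{code}
Note this only covers the direct spark-hive dependency; any other Spark module that pulls in avro-mapred with the broken classifier would need the same exclusion.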