Posted to issues@spark.apache.org by "Bernardo Gomez Palacio (JIRA)" <ji...@apache.org> on 2014/06/05 02:00:05 UTC
[jira] [Created] (SPARK-2026) Maven "hadoop*" Profiles Should Set the Expected Hadoop Version.
Bernardo Gomez Palacio created SPARK-2026:
---------------------------------------------
Summary: Maven "hadoop*" Profiles Should Set the expected Hadoop Version.
Key: SPARK-2026
URL: https://issues.apache.org/jira/browse/SPARK-2026
Project: Spark
Issue Type: Improvement
Components: Build
Affects Versions: 1.0.0
Reporter: Bernardo Gomez Palacio
The Maven profiles that refer to _hadoopX_, e.g. _hadoop-2.4_, should set the expected _hadoop.version_.
Currently, for example:
{code}
<profile>
  <id>hadoop-2.4</id>
  <properties>
    <protobuf.version>2.5.0</protobuf.version>
    <jets3t.version>0.9.0</jets3t.version>
  </properties>
</profile>
{code}
whereas it is suggested that the profile look like:
{code}
<profile>
  <id>hadoop-2.4</id>
  <properties>
    <hadoop.version>2.4.0</hadoop.version>
    <yarn.version>${hadoop.version}</yarn.version>
    <protobuf.version>2.5.0</protobuf.version>
    <jets3t.version>0.9.0</jets3t.version>
  </properties>
</profile>
{code}
Builds can still pass the -Dhadoop.version option, but the Hadoop version will now correctly default to the one expected for the selected profile.
e.g.
{code}
$ mvn -P hadoop-2.4,yarn clean compile
{code}
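And a build that needs a different patch release than the profile's default could still override the property on the command line, since properties given with -D take precedence over those set in a profile (the 2.4.1 version below is purely illustrative):
{code}
$ mvn -P hadoop-2.4,yarn -Dhadoop.version=2.4.1 clean compile
{code}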
--
This message was sent by Atlassian JIRA
(v6.2#6252)