Posted to dev@spark.apache.org by Corey Nolet <cj...@gmail.com> on 2014/11/14 16:43:49 UTC

Spark & Hadoop 2.5.1

I noticed Spark 1.2.0-SNAPSHOT still has 2.4.x in the pom. Since 2.5.x is
the current stable Hadoop 2.x, would it make sense for us to update the
poms?
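
For context, the hadoop-2.4 profile discussed in the replies lives in Spark's
parent pom and looks roughly like this (a minimal sketch; the real profile
also pins a few related dependency versions):

    <!-- Sketch of the hadoop-2.4 profile in Spark's parent pom.xml.  -->
    <!-- Activating it with -Phadoop-2.4 sets hadoop.version to 2.4.0 -->
    <!-- unless the property is overridden on the command line.       -->
    <profile>
      <id>hadoop-2.4</id>
      <properties>
        <hadoop.version>2.4.0</hadoop.version>
      </properties>
    </profile>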

Re: Spark & Hadoop 2.5.1

Posted by sa...@cloudera.com.
You're the second person to request this today. I'm planning to include it in my PR for SPARK-4338.

-Sandy

> On Nov 14, 2014, at 8:48 AM, Corey Nolet <cj...@gmail.com> wrote:
> 
> In the past, I've built it by providing -Dhadoop.version=2.5.1, exactly as
> you've mentioned. What prompted me to write this email was that I did not
> see any documentation telling me that Hadoop 2.5.1 is officially supported
> by Spark (i.e., the community has been using it, bugs are being fixed,
> etc.). It builds and the tests pass, but there could be other implications
> that I have not run into in my own use of the framework.
> 
> If we are saying that the standard procedure is to build with the
> hadoop-2.4 profile and override the hadoop.version property, should we at
> least document that in the build instructions [1]?
> 
> [1] http://spark.apache.org/docs/latest/building-with-maven.html
> 
>> On Fri, Nov 14, 2014 at 10:46 AM, Sean Owen <so...@cloudera.com> wrote:
>> 
>> I don't think it's necessary. You're looking at the hadoop-2.4
>> profile, which works with anything >= 2.4. AFAIK there is no further
>> specialization needed beyond that. The profile sets hadoop.version to
>> 2.4.0 by default, but this can be overridden.
>> 
>>> On Fri, Nov 14, 2014 at 3:43 PM, Corey Nolet <cj...@gmail.com> wrote:
>>> I noticed Spark 1.2.0-SNAPSHOT still has 2.4.x in the pom. Since 2.5.x is
>>> the current stable Hadoop 2.x, would it make sense for us to update the
>>> poms?
>> 


Re: Spark & Hadoop 2.5.1

Posted by Sean Owen <so...@cloudera.com>.
Yeah, I think someone even suggested exactly that today in a separate
thread. Couldn't hurt to just add an example.

On Fri, Nov 14, 2014 at 4:48 PM, Corey Nolet <cj...@gmail.com> wrote:
> In the past, I've built it by providing -Dhadoop.version=2.5.1, exactly as
> you've mentioned. What prompted me to write this email was that I did not
> see any documentation telling me that Hadoop 2.5.1 is officially supported
> by Spark (i.e., the community has been using it, bugs are being fixed,
> etc.). It builds and the tests pass, but there could be other implications
> that I have not run into in my own use of the framework.
>
> If we are saying that the standard procedure is to build with the
> hadoop-2.4 profile and override the hadoop.version property, should we at
> least document that in the build instructions [1]?
>
> [1] http://spark.apache.org/docs/latest/building-with-maven.html
>
> On Fri, Nov 14, 2014 at 10:46 AM, Sean Owen <so...@cloudera.com> wrote:
>>
>> I don't think it's necessary. You're looking at the hadoop-2.4
>> profile, which works with anything >= 2.4. AFAIK there is no further
>> specialization needed beyond that. The profile sets hadoop.version to
>> 2.4.0 by default, but this can be overridden.
>>
>> On Fri, Nov 14, 2014 at 3:43 PM, Corey Nolet <cj...@gmail.com> wrote:
>> > I noticed Spark 1.2.0-SNAPSHOT still has 2.4.x in the pom. Since 2.5.x
>> > is
>> > the current stable Hadoop 2.x, would it make sense for us to update the
>> > poms?
>
>
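
The example being suggested for the docs might read something like the
following (hypothetical wording, not the actual patch; the -Pyarn flag is
shown only as a common companion profile):

    # Apache Hadoop 2.4.X and later: select the hadoop-2.4 profile and
    # override its hadoop.version property with the release you run.
    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.5.1 -DskipTests clean package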


Re: Spark & Hadoop 2.5.1

Posted by Corey Nolet <cj...@gmail.com>.
In the past, I've built it by providing -Dhadoop.version=2.5.1, exactly as
you've mentioned. What prompted me to write this email was that I did not
see any documentation telling me that Hadoop 2.5.1 is officially supported
by Spark (i.e., the community has been using it, bugs are being fixed,
etc.). It builds and the tests pass, but there could be other implications
that I have not run into in my own use of the framework.

If we are saying that the standard procedure is to build with the
hadoop-2.4 profile and override the hadoop.version property, should we at
least document that in the build instructions [1]?

[1] http://spark.apache.org/docs/latest/building-with-maven.html

On Fri, Nov 14, 2014 at 10:46 AM, Sean Owen <so...@cloudera.com> wrote:

> I don't think it's necessary. You're looking at the hadoop-2.4
> profile, which works with anything >= 2.4. AFAIK there is no further
> specialization needed beyond that. The profile sets hadoop.version to
> 2.4.0 by default, but this can be overridden.
>
> On Fri, Nov 14, 2014 at 3:43 PM, Corey Nolet <cj...@gmail.com> wrote:
> > I noticed Spark 1.2.0-SNAPSHOT still has 2.4.x in the pom. Since 2.5.x is
> > the current stable Hadoop 2.x, would it make sense for us to update the
> > poms?
>
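
The build he describes would have been invoked along these lines (a sketch;
exact module and test flags depend on the checkout and environment):

    # Build against Hadoop 2.5.1 by reusing the hadoop-2.4 profile and
    # overriding its default hadoop.version property.
    mvn -Phadoop-2.4 -Dhadoop.version=2.5.1 -DskipTests clean package

    # Run the test suites against the same Hadoop version.
    mvn -Phadoop-2.4 -Dhadoop.version=2.5.1 test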

Re: Spark & Hadoop 2.5.1

Posted by Sean Owen <so...@cloudera.com>.
I don't think it's necessary. You're looking at the hadoop-2.4
profile, which works with anything >= 2.4. AFAIK there is no further
specialization needed beyond that. The profile sets hadoop.version to
2.4.0 by default, but this can be overridden.

On Fri, Nov 14, 2014 at 3:43 PM, Corey Nolet <cj...@gmail.com> wrote:
> I noticed Spark 1.2.0-SNAPSHOT still has 2.4.x in the pom. Since 2.5.x is
> the current stable Hadoop 2.x, would it make sense for us to update the
> poms?
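
One quick way to confirm which Hadoop version the profile resolves to is
the standard Maven help plugin (a sketch; the value appears amid Maven's
log output):

    # Prints the profile's default, 2.4.0.
    mvn -Phadoop-2.4 help:evaluate -Dexpression=hadoop.version

    # Prints 2.5.1: a -D set on the command line overrides the
    # profile's default for the same property.
    mvn -Phadoop-2.4 -Dhadoop.version=2.5.1 help:evaluate -Dexpression=hadoop.version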

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org