Posted to user@spark.apache.org by Hao Wang <wh...@gmail.com> on 2014/06/12 15:24:17 UTC

Spark 1.0.0 Standalone AppClient cannot connect to Master

Hi, all

Why was the section on building Spark against a specific Hadoop version
removed from the official Spark 1.0.0 documentation?

Does that mean I don't need to specify the Hadoop version when I build
Spark 1.0.0 with `sbt/sbt assembly`?
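
For reference, the exact command I run, with no Hadoop-related
environment variables set, is just:

    sbt/sbt assembly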


Regards,
Wang Hao(王灏)

CloudTeam | School of Software Engineering
Shanghai Jiao Tong University
Address: 800 Dongchuan Road, Minhang District, Shanghai, 200240
Email: wh.sjtu@gmail.com

Re: Spark 1.0.0 Standalone AppClient cannot connect to Master

Posted by Hao Wang <wh...@gmail.com>.
Hi, Andrew

Got it, thanks!

Hao

Regards,
Wang Hao(王灏)

CloudTeam | School of Software Engineering
Shanghai Jiao Tong University
Address: 800 Dongchuan Road, Minhang District, Shanghai, 200240
Email: wh.sjtu@gmail.com


On Fri, Jun 13, 2014 at 12:42 AM, Andrew Or <an...@databricks.com> wrote:

> Hi Wang Hao,
>
> It hasn't been removed; we moved it here:
> http://spark.apache.org/docs/latest/hadoop-third-party-distributions.html
> If you're building with SBT and don't specify SPARK_HADOOP_VERSION, it
> defaults to 1.0.4.
>
> Andrew
>
>
> 2014-06-12 6:24 GMT-07:00 Hao Wang <wh...@gmail.com>:
>
> Hi, all
>>
>> Why was the section on building Spark against a specific Hadoop version
>> removed from the official Spark 1.0.0 documentation?
>>
>> Does that mean I don't need to specify the Hadoop version when I build
>> Spark 1.0.0 with `sbt/sbt assembly`?
>>
>>
>> Regards,
>> Wang Hao(王灏)
>>
>> CloudTeam | School of Software Engineering
>> Shanghai Jiao Tong University
>> Address: 800 Dongchuan Road, Minhang District, Shanghai, 200240
>> Email: wh.sjtu@gmail.com
>>
>
>

Re: Spark 1.0.0 Standalone AppClient cannot connect to Master

Posted by Andrew Or <an...@databricks.com>.
Hi Wang Hao,

It hasn't been removed; we moved it here:
http://spark.apache.org/docs/latest/hadoop-third-party-distributions.html
If you're building with SBT and don't specify SPARK_HADOOP_VERSION, it
defaults to 1.0.4.
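
For example (a sketch; 2.2.0 below is a purely illustrative version,
so substitute whatever your cluster runs):

    SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly

And, if I recall the 1.0-era build instructions correctly, adding
SPARK_YARN=true builds in YARN support as well:

    SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly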

Andrew


2014-06-12 6:24 GMT-07:00 Hao Wang <wh...@gmail.com>:

> Hi, all
>
> Why was the section on building Spark against a specific Hadoop version
> removed from the official Spark 1.0.0 documentation?
>
> Does that mean I don't need to specify the Hadoop version when I build
> Spark 1.0.0 with `sbt/sbt assembly`?
>
>
> Regards,
> Wang Hao(王灏)
>
> CloudTeam | School of Software Engineering
> Shanghai Jiao Tong University
> Address: 800 Dongchuan Road, Minhang District, Shanghai, 200240
> Email: wh.sjtu@gmail.com
>