Posted to user@spark.apache.org by durga <du...@gmail.com> on 2014/07/23 02:37:10 UTC

How could I start a new Spark cluster with Hadoop 2.0.2

Hi,

I am trying to create a Spark cluster using the spark-ec2 script under the
spark-1.0.1 directory.

1) I noticed that it always installs Hadoop version 1.0.4. Is there a way I
can override that? I would like to have Hadoop 2.0.2.

2) I also want to install Oozie along with it. Are there any scripts
available with spark-ec2 that can create Oozie instances for me?

Thanks,
D.




Re: How could I start a new Spark cluster with Hadoop 2.0.2

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
There is no --hadoop-minor-version, but you can try
--hadoop-major-version=2.0.2.
Does it break anything if you use the 2.0.0 version of Hadoop?
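
As a rough illustration (a sketch only, not the actual spark_ec2.py source),
the script essentially maps the major version you pass onto one of a few
pre-built Hadoop distributions, which is why a specific minor version such as
2.0.2 cannot be requested on the command line:

# Sketch only, not the real spark_ec2.py code: --hadoop-major-version picks
# one of a small set of pre-built Hadoop distributions, so the exact minor
# version is not selectable.
HADOOP_BUILDS = {
    "1": "1.0.4",  # the default build, as seen earlier in this thread
    "2": "2.0.0",  # what --hadoop-major-version=2 resolves to, per this thread
}

def resolve_hadoop_build(major_version):
    """Return the Hadoop build the cluster will actually get."""
    if major_version not in HADOOP_BUILDS:
        raise SystemExit("Unsupported --hadoop-major-version: %s" % major_version)
    return HADOOP_BUILDS[major_version]

print(resolve_hadoop_build("2"))  # prints 2.0.0, even if you wanted 2.0.2

So the only knob exposed is the major version; the exact build behind it is
fixed by the spark-ec2 setup scripts.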

Thanks
Best Regards

On Wed, Oct 8, 2014 at 8:44 PM, st553 <st...@gmail.com> wrote:

> Hi,
>
> Were you able to figure out how to choose a specific version? I'm having the
> same issue.
>
> Thanks.
>

Re: How could I start a new Spark cluster with Hadoop 2.0.2

Posted by st553 <st...@gmail.com>.
Hi,

Were you able to figure out how to choose a specific version? I'm having the
same issue.

Thanks.






Re: How could I start a new Spark cluster with Hadoop 2.0.2

Posted by durga <du...@gmail.com>.
Hi,

It seems I can only give --hadoop-major-version=2, and it is taking 2.0.0.
How could I tell it to use 2.0.2? Is there a --hadoop-minor-version variable
I can use?

Thanks,
D.




Re: How could I start a new Spark cluster with Hadoop 2.0.2

Posted by durga <du...@gmail.com>.
Thanks, Akhil.




Re: How could I start a new Spark cluster with Hadoop 2.0.2

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
AFAIK you can use the --hadoop-major-version parameter with the spark-ec2
script (https://github.com/apache/spark/blob/master/ec2/spark_ec2.py) to
switch the Hadoop version.
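
For example, a minimal launch invocation could look like the sketch below,
run from the ec2 directory of the Spark distribution. The key pair, identity
file, slave count and cluster name are placeholders; --hadoop-major-version
is the only Hadoop-related knob:

import subprocess

# Minimal sketch with placeholder names: invoke the spark-ec2 script and
# request a Hadoop 2.x cluster. Only the major version is selectable.
subprocess.check_call([
    "./spark-ec2",
    "--key-pair=my-keypair",           # EC2 key pair name (placeholder)
    "--identity-file=my-keypair.pem",  # private key for that key pair (placeholder)
    "--slaves=2",                      # number of worker instances
    "--hadoop-major-version=2",        # 1 is the default (Hadoop 1.0.4)
    "launch", "my-spark-cluster",      # action and cluster name (placeholder)
])

Passing 2 here gets you the Hadoop 2 build that spark-ec2 ships with, not an
arbitrary minor version like 2.0.2.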

Thanks
Best Regards


On Wed, Jul 23, 2014 at 6:07 AM, durga <du...@gmail.com> wrote:

> Hi,
>
> I am trying to create a Spark cluster using the spark-ec2 script under the
> spark-1.0.1 directory.
>
> 1) I noticed that it always installs Hadoop version 1.0.4. Is there a way I
> can override that? I would like to have Hadoop 2.0.2.
>
> 2) I also want to install Oozie along with it. Are there any scripts
> available with spark-ec2 that can create Oozie instances for me?
>
> Thanks,
> D.
>