Posted to user@spark.apache.org by MEETHU MATHEW <me...@yahoo.co.in> on 2014/06/03 09:38:34 UTC

Upgrading to Spark 1.0.0

Hi,

I am currently using Spark 0.9 configured with a Hadoop 1.2.1 cluster. What should I do to upgrade it to Spark 1.0.0? Do I need to download the latest version, replace the existing Spark with the new one, and redo the configuration from scratch, or is there another way to move ahead?

Thanks,
Meethu M

Re: Upgrading to Spark 1.0.0

Posted by Matei Zaharia <ma...@gmail.com>.
You can copy your configuration over from the old installation. I’d suggest downloading the new version to a different location on each node first for testing; then you can delete the old one once things work.
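For illustration, here is a minimal sketch in Python of that copy step. The install paths below are made-up examples (assume the old 0.9 install lives at /opt/spark-0.9.1 and the new 1.0.0 download was unpacked to /opt/spark-1.0.0 on each node); adjust the paths and the list of conf files to your own setup.

    # Rough sketch only -- the install paths here are hypothetical examples.
    import shutil
    from pathlib import Path

    old_conf = Path("/opt/spark-0.9.1/conf")  # existing Spark 0.9 install (hypothetical path)
    new_conf = Path("/opt/spark-1.0.0/conf")  # freshly unpacked Spark 1.0.0 (hypothetical path)

    # Carry the existing cluster configuration over instead of redoing it from scratch.
    for name in ("spark-env.sh", "slaves", "log4j.properties"):
        src = old_conf / name
        if src.exists():
            shutil.copy2(src, new_conf / name)
            print("copied", name)

The same idea applies to any other files you have customized under conf/.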


On Jun 3, 2014, at 12:38 AM, MEETHU MATHEW <me...@yahoo.co.in> wrote:

> Hi,
> 
> I am currently using Spark 0.9 configured with a Hadoop 1.2.1 cluster. What should I do to upgrade it to Spark 1.0.0? Do I need to download the latest version, replace the existing Spark with the new one, and redo the configuration from scratch, or is there another way to move ahead?
> 
> Thanks,
> Meethu M