Posted to user@spark.apache.org by Daniel Mahler <dm...@gmail.com> on 2014/05/19 01:56:30 UTC

making spark/conf/spark-defaults.conf changes take effect

I am running on an AWS EC2 cluster that I launched using the spark-ec2
script that comes with Spark,
and I use the "-v master" option to run the head version.

If I then log into the master and make changes to spark/conf/spark-defaults.conf,
how do I make the changes take effect across the cluster?
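
For concreteness, the kind of edit I mean is something like this (the
particular properties and values below are only examples):

spark.executor.memory     4g
spark.serializer          org.apache.spark.serializer.KryoSerializer
spark.eventLog.enabled    true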

Is just restarting spark-shell enough? (It does not seem to be.)
Does "~/spark/sbin/stop-all.sh ; sleep 5; ~/spark/sbin/start-all.sh" do it?
Do I need to copy the new spark-defaults.conf to all the slaves?
Or is there some command to sync everything?

thanks
Daniel

Re: making spark/conf/spark-defaults.conf changes take effect

Posted by Andrew Or <an...@databricks.com>.
Hm, changes to spark-defaults.conf should take effect as soon as you start a
new application (e.g. a new spark-shell), since the file is read at launch
time. But yes, there is a script for syncing everything:

/root/spark-ec2/copy-dir --delete /root/spark

After that, run:

/root/spark/sbin/stop-all.sh
/root/spark/sbin/start-all.sh
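
If only the conf directory changed, a lighter-weight sketch (copy-dir takes
the directory to sync as its argument, so you can point it at just the conf
subdirectory) would be:

/root/spark-ec2/copy-dir /root/spark/conf   # sync only the edited config to the slaves
/root/spark/sbin/stop-all.sh                # stop master and workers
sleep 5
/root/spark/sbin/start-all.sh               # start master and workers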


