Posted to dev@spark.apache.org by Devl Devel <de...@gmail.com> on 2015/06/03 11:32:34 UTC

Stop Master and Slaves without SSH

Hey All,

start-slaves.sh and stop-slaves.sh use SSH to connect to the remote worker
machines. Are there alternative methods for doing this without SSH?

For example using:

./bin/spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT

works fine, but there is then no clean way to stop the Worker other than
stop-slave(s).sh or finding the process with ps -ef and killing it by hand.
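(Concretely, the manual workaround I mean is something along these lines,
assuming the Worker was launched with spark-class as above:

  # find the Worker's PID in the ps output and kill it by hand
  ps -ef | grep org.apache.spark.deploy.worker.Worker | grep -v grep \
    | awk '{print $2}' | xargs kill

which is obviously more fragile than a proper stop script.)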

Is there an alternative along the lines of Hadoop's hadoop-daemon.sh
start|stop xyz?
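For example, on a Hadoop node you can stop a single daemon locally, with no
SSH involved:

  # stop just the local DataNode daemon
  hadoop-daemon.sh stop datanode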

I noticed that spark-daemon.sh exists, but maybe we need to improve the
documentation around it. For instance:

 Usage: spark-daemon.sh [--config <conf-dir>] (start|stop|status)
<spark-command> <spark-instance-number> <args>

What are the valid spark-commands? Can spark-daemon.sh be used to start and
stop workers on the current node?
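Judging by how sbin/start-master.sh invokes it, <spark-command> appears to be
a fully qualified deploy class, so I'd guess (untested) that something like
this would start and stop a Worker on the current node:

  # start worker instance 1 locally, pointing it at the master
  ./sbin/spark-daemon.sh start org.apache.spark.deploy.worker.Worker 1 spark://IP:PORT

  # stop the same worker instance, no SSH involved
  ./sbin/spark-daemon.sh stop org.apache.spark.deploy.worker.Worker 1

If that is the intended usage, it would be worth spelling out in the docs.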

Many thanks
Devl