Posted to user@spark.apache.org by Jesper Lundgren <ko...@gmail.com> on 2014/11/16 09:46:13 UTC

How to kill/upgrade/restart driver launched in Spark standalone cluster+supervised mode?

Hello,

I have a Spark Standalone cluster running in HA mode. I launched an
application using spark-submit with cluster deploy mode and supervised mode
enabled, and it launched successfully on one of the worker nodes.
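For reference, the launch command was along these lines (the master URL, jar path, and main class here are placeholders, not my actual values):

```shell
# Submit the driver to the standalone master in cluster mode,
# with --supervise so the master restarts the driver if it fails.
./bin/spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  --supervise \
  --class com.example.MyStreamingApp \
  /path/to/my-streaming-app.jar
```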

How can I stop, restart, kill, or otherwise manage such an application
running in a standalone cluster? There seem to be no options for this in
the web interface. I also wonder how I can upgrade my driver in the future.

Also, does supervised mode work across worker nodes? That is, will the
driver be relaunched on another node if the current one dies, or does it
only handle restarts on the same node after a driver crash?

I would love to hear about others' experience with this :)

Thanks!

(PS: I am launching a Spark Streaming application)

// Jesper Lundgren