Posted to commits@spark.apache.org by an...@apache.org on 2014/12/10 21:41:41 UTC

spark git commit: [SPARK-4771][Docs] Document standalone cluster supervise mode

Repository: spark
Updated Branches:
  refs/heads/master 0fc637b4c -> 56212831c


[SPARK-4771][Docs] Document standalone cluster supervise mode

tdas It looks like the streaming docs already refer to the supervise mode, but the link from there is broken.

Author: Andrew Or <an...@databricks.com>

Closes #3627 from andrewor14/document-supervise and squashes the following commits:

9ca0908 [Andrew Or] Wording changes
2b55ed2 [Andrew Or] Document standalone cluster supervise mode


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/56212831
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/56212831
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/56212831

Branch: refs/heads/master
Commit: 56212831c6436e287a19908e82c26117cbcb16b0
Parents: 0fc637b
Author: Andrew Or <an...@databricks.com>
Authored: Wed Dec 10 12:41:36 2014 -0800
Committer: Andrew Or <an...@databricks.com>
Committed: Wed Dec 10 12:41:36 2014 -0800

----------------------------------------------------------------------
 docs/spark-standalone.md | 11 ++++++++++-
 1 file changed, 10 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/56212831/docs/spark-standalone.md
----------------------------------------------------------------------
diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index ae7b81d..5c6084f 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -257,7 +257,7 @@ To run an interactive Spark shell against the cluster, run the following command
 
 You can also pass an option `--total-executor-cores <numCores>` to control the number of cores that spark-shell uses on the cluster.
 
-# Launching Compiled Spark Applications
+# Launching Spark Applications
 
 The [`spark-submit` script](submitting-applications.html) provides the most straightforward way to
 submit a compiled Spark application to the cluster. For standalone clusters, Spark currently
@@ -272,6 +272,15 @@ should specify them through the `--jars` flag using comma as a delimiter (e.g. `
 To control the application's configuration or execution environment, see
 [Spark Configuration](configuration.html).
 
+Additionally, standalone `cluster` mode supports restarting your application automatically if it
+exits with a non-zero exit code. To use this feature, you may pass in the `--supervise` flag to
+`spark-submit` when launching your application. Then, if you wish to kill an application that is
+failing repeatedly, you may do so through:
+
+    ./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>
+
+You can find the driver ID through the standalone Master web UI at `http://<master url>:8080`.
+
 # Resource Scheduling
 
 The standalone cluster mode currently only supports a simple FIFO scheduler across applications.

