Posted to user@spark.apache.org by stanley <wa...@yahoo.com> on 2014/09/12 16:59:11 UTC

How to initiate a shutdown of Spark Streaming context?

The Spark Streaming programming guide
<https://spark.apache.org/docs/latest/streaming-programming-guide.html>
specifically describes how to shut down a Spark Streaming context:

The existing application is shut down gracefully (see
StreamingContext.stop(...) or JavaStreamingContext.stop(...) for graceful
shutdown options), which ensures that data that has been received is
completely processed before shutdown.
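
For reference, the graceful stop the guide points to looks roughly like this
in the Scala API (a minimal sketch; ssc stands for the application's
StreamingContext):

    import org.apache.spark.streaming.StreamingContext

    // Minimal illustration of a graceful stop.
    // stopGracefully = true waits for received data to be fully processed;
    // stopSparkContext = true also shuts down the underlying SparkContext.
    def shutDown(ssc: StreamingContext): Unit =
      ssc.stop(stopSparkContext = true, stopGracefully = true)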

However, my question is: how do I initiate a shutdown? Assuming I am
upgrading a running Spark Streaming system, how do I send a message to the
running Spark Streaming instance so that StreamingContext.stop(...) is
called?

Thanks,

Stanley






Re: How to initiate a shutdown of Spark Streaming context?

Posted by Jeoffrey Lim <je...@gmail.com>.
What we did to gracefully shut down the Spark Streaming context was to
extend a Spark Web UI tab and call
SparkContext.SparkUI.attachTab(<custom web ui>). However, the custom Scala
Web UI extensions need to live under the package org.apache.spark.ui to get
around the package access restrictions.
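
For illustration only, a rough sketch of such a tab is below. SparkUITab,
WebUIPage and the attachTab call are internal (private[spark]) APIs, so the
exact signatures vary between Spark versions, and the class has to live under
org.apache.spark.ui as noted above:

    package org.apache.spark.ui  // required to see the private[spark] UI classes

    import javax.servlet.http.HttpServletRequest
    import scala.xml.Node
    import org.apache.spark.streaming.StreamingContext

    // Sketch of a custom tab with a single page that requests a graceful stop
    // of the StreamingContext. A real implementation should only stop after an
    // explicit confirmation (e.g. a form POST), not on a plain page view.
    class ShutdownTab(parent: SparkUI, ssc: StreamingContext)
        extends SparkUITab(parent, "shutdown") {

      attachPage(new WebUIPage("") {
        override def render(request: HttpServletRequest): Seq[Node] = {
          ssc.stop(stopSparkContext = false, stopGracefully = true)
          <p>Graceful shutdown of the StreamingContext has been requested.</p>
        }
      })
    }

The tab is then registered with the attachTab(...) call mentioned above,
which likewise needs package-level access to SparkContext's ui field.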

Would it be possible to expose the SparkUI under SparkContext and the Spark
Web UI packages as public, so that developers can add customizations with
their own tools?

Thanks!

On Tue, Sep 16, 2014 at 12:34 AM, stanley [via Apache Spark User List] <
ml-node+s1001560n14252h43@n3.nabble.com> wrote:

> Thank you.
>
> Would the following approaches to addressing this problem be overkill?
>
> a. create a ServerSocket in a different thread from the main thread that
> created the Spark StreamingContext, and listen for a shutdown command there
> b. create a web service that wraps around the main thread that created the
> Spark StreamingContext, and respond to shutdown requests
>
> Does Spark Streaming already provide similar capabilities?
>
> Stanley
>





Re: How to initiate a shutdown of Spark Streaming context?

Posted by stanley <wa...@yahoo.com>.
Thank you. 

Would the following approaches to addressing this problem be overkill?

a. create a ServerSocket in a different thread from the main thread that
created the Spark StreamingContext, and listen for a shutdown command there
(see the sketch below)
b. create a web service that wraps around the main thread that created the
Spark StreamingContext, and respond to shutdown requests

Does Spark Streaming already provide similar capabilities? 
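
For what it's worth, a minimal sketch of option (a) is below; the
listenForShutdown helper, the thread name and the port are made up for
illustration:

    import java.net.ServerSocket
    import org.apache.spark.streaming.StreamingContext

    // Hypothetical helper for option (a): a daemon thread that blocks on a
    // ServerSocket and gracefully stops the StreamingContext as soon as any
    // client connects to the port (e.g. `nc <driver-host> 9999`).
    def listenForShutdown(ssc: StreamingContext, port: Int): Unit = {
      val listener = new Thread("streaming-shutdown-listener") {
        override def run(): Unit = {
          val server = new ServerSocket(port)
          try {
            server.accept() // block until any connection arrives
            ssc.stop(stopSparkContext = true, stopGracefully = true)
          } finally {
            server.close()
          }
        }
      }
      listener.setDaemon(true)
      listener.start()
    }

It would be called once on the driver, after ssc.start() and before
ssc.awaitTermination().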

Stanley






Re: How to initiate a shutdown of Spark Streaming context?

Posted by Sean Owen <so...@cloudera.com>.
Your app is the running Spark Streaming system. It is up to you to build
some mechanism that causes it to call stop() in response to a signal from
you.
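
For example, a simple mechanism (a sketch only; the marker-file path and
polling interval are made up) is to poll for an external stop signal on the
driver instead of blocking forever in awaitTermination():

    import java.nio.file.{Files, Paths}
    import org.apache.spark.streaming.StreamingContext

    // Sketch: start the context, then poll for a marker file on the driver.
    // Creating the file (e.g. `touch /tmp/stop-streaming-app`) triggers a
    // graceful stop; swap in an HDFS flag, a ZooKeeper node, etc. as needed.
    def runUntilSignalled(ssc: StreamingContext): Unit = {
      ssc.start()
      while (!Files.exists(Paths.get("/tmp/stop-streaming-app"))) {
        Thread.sleep(10000) // check for the stop signal every 10 seconds
      }
      ssc.stop(stopSparkContext = true, stopGracefully = true)
    }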

On Fri, Sep 12, 2014 at 3:59 PM, stanley <wa...@yahoo.com> wrote:
> The Spark Streaming programming guide
> <https://spark.apache.org/docs/latest/streaming-programming-guide.html>
> specifically describes how to shut down a Spark Streaming context:
>
> The existing application is shut down gracefully (see
> StreamingContext.stop(...) or JavaStreamingContext.stop(...) for graceful
> shutdown options), which ensures that data that has been received is
> completely processed before shutdown.
>
> However, my question is: how do I initiate a shutdown? Assuming I am
> upgrading a running Spark Streaming system, how do I send a message to the
> running Spark Streaming instance so that StreamingContext.stop(...) is
> called?
>
> Thanks,
>
> Stanley
