Posted to user@spark.apache.org by "Rastogi, Pankaj" <pa...@verizon.com> on 2017/12/22 18:14:21 UTC

Re: [E] How to do stop streaming before the application got killed

You can add a shutdown hook to your JVM and request the Spark StreamingContext
to stop gracefully.

  import org.apache.spark.streaming.StreamingContext

  /**
   * Registers a JVM shutdown hook that stops the StreamingContext gracefully,
   * letting in-flight batches finish before the process exits.
   * @param ssCtx the running StreamingContext
   */
  def addShutdownHook(ssCtx: StreamingContext): Unit = {

    Runtime.getRuntime.addShutdownHook(new Thread() {

      override def run(): Unit = {

        println("In shutdown hook")
        // stop the StreamingContext and the underlying SparkContext,
        // waiting for processing of all received data to complete
        ssCtx.stop(stopSparkContext = true, stopGracefully = true)
      }
    })
  }
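
For reference, here is a minimal sketch of how the hook might be wired into a
driver program. The object name, app name, socket source, and 10-second batch
interval are placeholders, and it assumes the addShutdownHook method above is
in scope:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object GracefulStreamingApp {

  def main(args: Array[String]): Unit = {
    // app name and batch interval are illustrative only
    val conf = new SparkConf().setAppName("graceful-streaming-app")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // illustrative input; replace with your real DStream and output operations
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.print()

    // register the hook before starting the context, so that a SIGTERM
    // (e.g. from a YARN kill) gives the hook a chance to stop gracefully
    addShutdownHook(ssc)

    ssc.start()
    ssc.awaitTermination()
  }
}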

Pankaj

On Fri, Dec 22, 2017 at 9:56 AM, Toy <no...@gmail.com> wrote:

> I'm trying to write a deployment job for a Spark application. Basically the
> job sends yarn application --kill app_id to the cluster, but after the
> application receives the signal it dies without finishing whatever it is
> processing or stopping the stream.
>
> I'm using Spark Streaming. What's the best way to stop a Spark application
> so we won't lose any data?