Posted to user@spark.apache.org by Guy Harmach <Gu...@Amdocs.com> on 2016/07/12 06:56:58 UTC

Spark streaming graceful shutdown when running on yarn-cluster deploy-mode

Hi,

I'm a newbie to Spark, starting to work with Spark 1.5 using the Java API (about to upgrade to 1.6 soon).
I am deploying a Spark Streaming application using spark-submit in yarn-cluster mode.
What is the recommended way to perform a graceful shutdown of the Spark job?
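For reference, a submission along these lines is what I mean (jar and class names here are placeholders; the relevant setting is spark.streaming.stopGracefullyOnShutdown):

```shell
# Sketch of the spark-submit invocation (Spark 1.5 style yarn-cluster master).
# com.example.StreamingApp and streaming-app.jar are hypothetical placeholders.
spark-submit \
  --master yarn-cluster \
  --conf spark.streaming.stopGracefullyOnShutdown=true \
  --class com.example.StreamingApp \
  streaming-app.jar
```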

I have already tried setting the spark.streaming.stopGracefullyOnShutdown configuration, adding a JVM shutdown hook, and implementing the onStop() method.
I can see in the logs that the hooks are invoked and the configuration is read, but the application terminates immediately, and the Spark staging directory on HDFS is not cleaned up either.
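One pattern I have seen suggested (not yet tried end-to-end, and the marker path below is hypothetical) is not to rely on the JVM shutdown hook at all, but to have the driver poll an external HDFS marker file and call stop(stopSparkContext=true, stopGracefully=true) itself once the marker appears, e.g.:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class GracefulShutdownSketch {

    // Hypothetical marker location; create it (e.g. with `hdfs dfs -touchz`)
    // when the job should shut down.
    private static final String MARKER = "/tmp/streaming-shutdown-marker";

    // Call this after jssc.start() instead of jssc.awaitTermination().
    public static void awaitShutdownMarker(JavaStreamingContext jssc) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        boolean stopped = false;
        while (!stopped) {
            // Block for up to 10 s; returns true if the context stopped on its own.
            stopped = jssc.awaitTerminationOrTimeout(10_000);
            if (!stopped && fs.exists(new Path(MARKER))) {
                // Graceful stop: let in-flight batches finish,
                // then stop the underlying SparkContext as well.
                jssc.stop(true, true);
                stopped = true;
            }
        }
    }
}
```

Would this driver-side polling approach be the recommended way on yarn-cluster, given that killing the application from YARN does not seem to leave the hooks enough time to run?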

Thanks,
Guy
