Posted to dev@spark.apache.org by Jacek Laskowski <ja...@japila.pl> on 2016/07/30 08:57:25 UTC

[YARN] Question about ApplicationMaster's shutdown hook (priority)

Hi,

When ApplicationMaster runs, it registers a shutdown hook [1] that
carries this comment [2] in the code:

> // This shutdown hook should run *after* the SparkContext is shut down.

And so it gets a priority lower than SparkContext's [3], i.e.

val priority = ShutdownHookManager.SPARK_CONTEXT_SHUTDOWN_PRIORITY - 1
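For context, the relevant constants in ShutdownHookManager are (quoting
from memory, so please double-check against master):

  // Default priority for hooks registered without an explicit one.
  val DEFAULT_SHUTDOWN_PRIORITY = 100

  // Lower than the default, which (under higher-runs-first semantics)
  // means the SparkContext hook runs *after* default-priority hooks.
  val SPARK_CONTEXT_SHUTDOWN_PRIORITY = 50

  // Temp directories are cleaned up only after the SparkContext is down.
  val TEMP_DIR_SHUTDOWN_PRIORITY = 25

so the ApplicationMaster hook ends up with priority 49.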

But the scaladoc of ShutdownHookManager.addShutdownHook says [4]:

> Adds a shutdown hook with the given priority. Hooks with lower priority values run first.

My understanding is that one of these comments is no longer true (if it
ever was).
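
A quick way to check the actual ordering empirically (a throwaway
snippet; it has to live under the org.apache.spark package because
ShutdownHookManager is private[spark], and the package/object names
here are just for illustration):

  package org.apache.spark.demo

  import org.apache.spark.util.ShutdownHookManager

  object HookOrder {
    def main(args: Array[String]): Unit = {
      ShutdownHookManager.addShutdownHook(10)(() => println("hook with priority 10"))
      ShutdownHookManager.addShutdownHook(90)(() => println("hook with priority 90"))
      // On JVM exit this prints the priority-90 line first, which
      // contradicts the scaladoc's "lower priority values run first".
    }
  }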

Please help me understand that part of the code. Thanks.

[1] https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala#L206
[2] https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala#L204
[3] https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala#L205
[4] https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala#L146-L147

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: [YARN] Question about ApplicationMaster's shutdown hook (priority)

Posted by Sean Owen <so...@cloudera.com>.
Good catch; yes, it's the scaladoc of addShutdownHook that is wrong. It
says that hooks with lower priority values execute first.

The implementation does the opposite: it uses a min queue of shutdown
hooks but inverts the comparison, so hooks with higher priority values
execute first.
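
Roughly, the trick is equivalent to this stripped-down mimic
(illustrative names, not the actual Spark code):

  import java.util.PriorityQueue

  class Hook(val priority: Int, val body: () => Unit) extends Comparable[Hook] {
    // Inverted comparison: the "smallest" element of the min queue is
    // the hook with the HIGHEST priority, so polling the queue yields
    // higher-priority hooks first.
    override def compareTo(other: Hook): Int = other.priority - priority
  }

  object Demo extends App {
    val hooks = new PriorityQueue[Hook]()
    hooks.add(new Hook(50, () => println("SparkContext shutdown (50)")))
    hooks.add(new Hook(49, () => println("ApplicationMaster cleanup (49)")))
    while (!hooks.isEmpty) hooks.poll().body()
    // Prints the 50 hook before the 49 hook: higher priority runs first,
    // so the AM hook does run after the SparkContext hook, as the code
    // comment intends.
  }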

The constants and comments in ShutdownHookManager are consistent with
executing higher priority values first.

So I think you can fix the scaladoc. Other usages of priority seem
consistent with current behavior.
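
Something along these lines for the corrected doc (exact wording up to
whoever sends the PR):

  /**
   * Adds a shutdown hook with the given priority. Hooks with higher
   * priority values run first.
   */
  def addShutdownHook(priority: Int)(hook: () => Unit): AnyRef = ...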


