Posted to issues@spark.apache.org by "Russell Spitzer (JIRA)" <ji...@apache.org> on 2018/01/08 23:53:00 UTC

[jira] [Commented] (SPARK-22976) Worker cleanup can remove running driver directories

    [ https://issues.apache.org/jira/browse/SPARK-22976?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16317367#comment-16317367 ] 

Russell Spitzer commented on SPARK-22976:
-----------------------------------------

I made a PR against 2.0, but the issue (and the fix) is valid against all versions up to master.

> Worker cleanup can remove running driver directories
> ----------------------------------------------------
>
>                 Key: SPARK-22976
>                 URL: https://issues.apache.org/jira/browse/SPARK-22976
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy, Spark Core
>    Affects Versions: 1.0.2
>            Reporter: Russell Spitzer
>
> Spark Standalone worker cleanup finds directories to remove with a listFiles command.
> This includes both application directories and driver directories from applications submitted in cluster mode.
> A directory is considered not to be part of a running app if the worker does not have an executor with a matching ID.
> https://github.com/apache/spark/blob/v2.2.1/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala#L432
> {code}
>       val appIds = executors.values.map(_.appId).toSet
>       val isAppStillRunning = appIds.contains(appIdFromDir)
> {code}
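> For context, here is a simplified, self-contained model of the cleanup pass around those lines (the path and names are illustrative; the real logic is at the link above). It only prints what it would remove:
> {code}
> import java.io.File
>
> // Illustrative model of the cleanup pass: list the work dir, keep directories
> // whose name matches a running app's ID, and flag the rest for removal.
> // Driver directories are named "driver-..." and never match an executor appId.
> object WorkDirCleanupModel extends App {
>   val workDir = new File("/var/lib/spark/worker")
>   val executorAppIds: Set[String] = Set("app-20180105234824-0001")
>
>   Option(workDir.listFiles()).getOrElse(Array.empty[File])
>     .filter(_.isDirectory)
>     .filterNot(dir => executorAppIds.contains(dir.getName))
>     .foreach(dir => println(s"Would remove: ${dir.getPath}"))
> }
> {code}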
> If a driver has been started on a node, but all of the executors are on other nodes, the worker running the driver will always assume that the driver directory is not part of a running app.
> Consider a two-node Spark cluster with Worker A and Worker B, where each node has a single core available. We submit our application in cluster deploy mode; the driver begins running on Worker A while the executor starts on Worker B.
> Worker A has a cleanup triggered and finds it has a directory
> {code}
> /var/lib/spark/worker/driver-20180105234824-0000
> {code}
> Worker A checks its executor list and finds no entries that match, since it has no corresponding executors for this application. Worker A then removes the directory even though the driver may still be actively running.
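> To make the failure concrete, here is a minimal, self-contained sketch of that check in this scenario (the object and variable names are hypothetical, not the actual Worker.scala fields):
> {code}
> // Hypothetical reproduction of the check on Worker A, which runs the driver
> // but no executors for the application.
> object WorkerACleanupDemo extends App {
>   // App IDs from this worker's executors: empty, since the only executor is on Worker B.
>   val executorAppIds: Set[String] = Set.empty
>
>   // Directory created for the cluster-mode driver on Worker A.
>   val appIdFromDir = "driver-20180105234824-0000"
>
>   val isAppStillRunning = executorAppIds.contains(appIdFromDir)
>   // Prints false, so the cleanup pass deletes the running driver's directory.
>   println(s"isAppStillRunning = $isAppStillRunning")
> }
> {code}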
> I think this could be fixed by modifying line 432 to be
> {code}
>       val appIds = executors.values.map(_.appId).toSet ++ drivers.values.map(_.driverId)
> {code}
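> With that change the same scenario keeps the directory, because the worker's tracked drivers contribute their IDs to the set. Again a hypothetical sketch, with driverIds standing in for drivers.values.map(_.driverId):
> {code}
> // Hypothetical sketch of the fixed check on Worker A.
> object WorkerAFixedDemo extends App {
>   val executorAppIds: Set[String] = Set.empty
>   // Driver IDs tracked by this worker.
>   val driverIds: Set[String] = Set("driver-20180105234824-0000")
>
>   val appIds = executorAppIds ++ driverIds
>   val isAppStillRunning = appIds.contains("driver-20180105234824-0000")
>   // Prints true, so cleanup leaves the running driver's directory alone.
>   println(s"isAppStillRunning = $isAppStillRunning")
> }
> {code}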
> I'll run a test and submit a PR soon.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org