Posted to commits@spark.apache.org by sr...@apache.org on 2017/01/08 09:29:06 UTC
spark git commit: [SPARK-19026] SPARK_LOCAL_DIRS (multiple directories on different disks) cannot be deleted
Repository: spark
Updated Branches:
refs/heads/master 6b6b555a1 -> cd1d00ada
[SPARK-19026] SPARK_LOCAL_DIRS(multiple directories on different disks) cannot be deleted
JIRA Issue: https://issues.apache.org/jira/browse/SPARK-19026
SPARK_LOCAL_DIRS (Standalone) can be a comma-separated list of directories on different disks, e.g. SPARK_LOCAL_DIRS=/dir1,/dir2,/dir3. If an IOException is thrown while creating the sub-directory on dir3, the sub-directories that were already created successfully on dir1 and dir2 can never be deleted when the application finishes.
We should therefore catch the IOException around Utils.createDirectory; otherwise appDirectories(appId), which maybeCleanupApplication reads, is never set, so dir1 and dir2 are never cleaned up.
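The core pattern of the fix can be sketched as a standalone snippet: attempt to create an app-specific sub-directory under each configured local root, skip any root that fails with an IOException, and fail only if every root failed. Note this is a simplified illustration, not Spark's actual code path; the names `createAppDirs` and `roots` are hypothetical, and plain `File.mkdirs` plus `System.nanoTime` stand in for `Utils.createDirectory`.

```scala
import java.io.{File, IOException}

// Hypothetical helper illustrating the fix: per-root try/catch so one bad
// disk does not prevent the successful directories from being recorded
// (and thus cleaned up later).
def createAppDirs(roots: Seq[File], prefix: String = "executor"): Seq[String] = {
  val dirs = roots.flatMap { root =>
    try {
      // Stand-in for Utils.createDirectory(dir, namePrefix = "executor")
      val appDir = new File(root, prefix + "-" + System.nanoTime())
      if (!appDir.mkdirs()) {
        throw new IOException(s"Failed to create directory $appDir")
      }
      Some(appDir.getAbsolutePath)
    } catch {
      case e: IOException =>
        // A failing root is skipped, mirroring the Worker's logWarning + None
        println(s"${e.getMessage}. Ignoring this directory.")
        None
    }
  }
  if (dirs.isEmpty) {
    throw new IOException("No subfolder can be created in " + roots.mkString(","))
  }
  dirs
}
```

With this shape, the caller always receives the list of directories that were actually created, so they can be registered for cleanup even when some roots are unusable.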
Author: zuotingbing <zu...@zte.com.cn>
Closes #16439 from zuotingbing/master.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/cd1d00ad
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/cd1d00ad
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/cd1d00ad
Branch: refs/heads/master
Commit: cd1d00adaff65e8adfebc2342dd422c53f98166b
Parents: 6b6b555
Author: zuotingbing <zu...@zte.com.cn>
Authored: Sun Jan 8 09:29:01 2017 +0000
Committer: Sean Owen <so...@cloudera.com>
Committed: Sun Jan 8 09:29:01 2017 +0000
----------------------------------------------------------------------
.../org/apache/spark/deploy/worker/Worker.scala | 25 +++++++++++++++-----
1 file changed, 19 insertions(+), 6 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/spark/blob/cd1d00ad/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala b/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala
index f963a46..e48817e 100755
--- a/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala
@@ -445,12 +445,25 @@ private[deploy] class Worker(
// Create local dirs for the executor. These are passed to the executor via the
// SPARK_EXECUTOR_DIRS environment variable, and deleted by the Worker when the
// application finishes.
- val appLocalDirs = appDirectories.getOrElse(appId,
- Utils.getOrCreateLocalRootDirs(conf).map { dir =>
- val appDir = Utils.createDirectory(dir, namePrefix = "executor")
- Utils.chmod700(appDir)
- appDir.getAbsolutePath()
- }.toSeq)
+ val appLocalDirs = appDirectories.getOrElse(appId, {
+ val localRootDirs = Utils.getOrCreateLocalRootDirs(conf)
+ val dirs = localRootDirs.flatMap { dir =>
+ try {
+ val appDir = Utils.createDirectory(dir, namePrefix = "executor")
+ Utils.chmod700(appDir)
+ Some(appDir.getAbsolutePath())
+ } catch {
+ case e: IOException =>
+ logWarning(s"${e.getMessage}. Ignoring this directory.")
+ None
+ }
+ }.toSeq
+ if (dirs.isEmpty) {
+ throw new IOException("No subfolder can be created in " +
+ s"${localRootDirs.mkString(",")}.")
+ }
+ dirs
+ })
appDirectories(appId) = appLocalDirs
val manager = new ExecutorRunner(
appId,