Posted to reviews@spark.apache.org by XuTingjun <gi...@git.apache.org> on 2014/12/08 10:13:46 UTC

[GitHub] spark pull request: Add error message when making local dir unsucc...

GitHub user XuTingjun opened a pull request:

    https://github.com/apache/spark/pull/3635

    Add error message when making local dir unsuccessfully

    

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/XuTingjun/spark master

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/3635.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #3635
    
----
commit 1c51a0c78c8477f4aae83ec18212c773aed57701
Author: meiyoula <10...@qq.com>
Date:   2014-12-08T09:11:09Z

    Update DiskBlockManager.scala

----



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3635#discussion_r21526216
  
    --- Diff: core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
    @@ -67,11 +67,29 @@ private[spark] class DiskBlockManager(blockManager: BlockManager, conf: SparkCon
         if (subDir == null) {
           subDir = subDirs(dirId).synchronized {
             val old = subDirs(dirId)(subDirId)
    -        if (old != null) {
    +        if (old != null && old.exists()) {
               old
             } else {
    +          var foundLocalDir = false
    +          var tries = 0
               val newDir = new File(localDirs(dirId), "%02x".format(subDirId))
    -          newDir.mkdir()
    +          while (!foundLocalDir && tries < MAX_DIR_CREATION_ATTEMPTS) {
    +            tries += 1
    +            try {
    +              if (!newDir.exists()) {
    +                foundLocalDir = newDir.mkdir()
    --- End diff --
    
    Why keep retrying this? What's the case where it fails but then immediately succeeds?



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3635#discussion_r21526232
  
    --- Diff: core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
    @@ -67,11 +67,29 @@ private[spark] class DiskBlockManager(blockManager: BlockManager, conf: SparkCon
         if (subDir == null) {
           subDir = subDirs(dirId).synchronized {
             val old = subDirs(dirId)(subDirId)
    -        if (old != null) {
    +        if (old != null && old.exists()) {
               old
             } else {
    +          var foundLocalDir = false
    +          var tries = 0
               val newDir = new File(localDirs(dirId), "%02x".format(subDirId))
    -          newDir.mkdir()
    +          while (!foundLocalDir && tries < MAX_DIR_CREATION_ATTEMPTS) {
    +            tries += 1
    +            try {
    +              if (!newDir.exists()) {
    +                foundLocalDir = newDir.mkdir()
    +              }
    +              else {
    +                foundLocalDir = true
    +              }
    +            } catch {
    +              case e: Exception =>
    +                logWarning(s"Attempt $tries to create local dir $newDir failed", e)
    +            }
    +          }
    +          if (!foundLocalDir) {
    +            throw new Exception(s"Failed $MAX_DIR_CREATION_ATTEMPTS attempts to create local dir in $newDir.")
    --- End diff --
    
    `IOException` is more appropriate.
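
    For illustration only, here is a self-contained sketch of the retry-then-throw pattern with an `IOException`, as suggested; the method and parameter names below are placeholders rather than anything taken from the PR:

        import java.io.{File, IOException}

        /** Try to create `dir` up to `maxAttempts` times; throw an IOException if it never succeeds. */
        def createDirWithRetries(dir: File, maxAttempts: Int): Unit = {
          var created = false
          var tries = 0
          while (!created && tries < maxAttempts) {
            tries += 1
            // File.mkdir() returns false when the directory was not created by this call
            created = dir.exists() || dir.mkdir()
          }
          if (!created) {
            throw new IOException(s"Failed $maxAttempts attempts to create local dir $dir")
          }
        }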



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by XuTingjun <gi...@git.apache.org>.
Github user XuTingjun commented on the pull request:

    https://github.com/apache/spark/pull/3635#issuecomment-66283335
  
    @srowen, thank you for your suggestion; I have modified the method accordingly.



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/3635#issuecomment-66356636
  
      [Test build #24254 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/24254/consoleFull) for   PR 3635 at commit [`2a55bc2`](https://github.com/apache/spark/commit/2a55bc227e1247a83d7a32b647667d44fcab9952).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/3635



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3635#discussion_r21879937
  
    --- Diff: core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
    @@ -67,11 +67,13 @@ private[spark] class DiskBlockManager(blockManager: BlockManager, conf: SparkCon
         if (subDir == null) {
           subDir = subDirs(dirId).synchronized {
             val old = subDirs(dirId)(subDirId)
    -        if (old != null) {
    +        if (old != null && old.exists()) {
               old
             } else {
               val newDir = new File(localDirs(dirId), "%02x".format(subDirId))
    -          newDir.mkdir()
    +          if (!newDir.exists() && !newDir.mkdir()) {
    --- End diff --
    
    @srowen 
    
    > One thing to note is that mkdir returns false if the directory already existed, which shouldn't be an error (?)
    
    I think that it _is_ an error in this context, though, since this branch is only supposed to be taken for subdirectories that don't exist.



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by XuTingjun <gi...@git.apache.org>.
Github user XuTingjun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3635#discussion_r21876377
  
    --- Diff: core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
    @@ -67,11 +67,13 @@ private[spark] class DiskBlockManager(blockManager: BlockManager, conf: SparkCon
         if (subDir == null) {
           subDir = subDirs(dirId).synchronized {
             val old = subDirs(dirId)(subDirId)
    -        if (old != null) {
    +        if (old != null && old.exists()) {
               old
             } else {
               val newDir = new File(localDirs(dirId), "%02x".format(subDirId))
    -          newDir.mkdir()
    +          if (!newDir.exists() && !newDir.mkdir()) {
    --- End diff --
    
    After reading more of the code, I found this function is used for both reading and writing.
    For writing, if the dir already exists I think it is fine to write files into it, so that should not be an error.
    For reading, the dirs should already exist and contain the shuffle data and other files, so when the old dir doesn't exist, it should throw an exception.



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/3635#issuecomment-67114246
  
      [Test build #24482 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/24482/consoleFull) for   PR 3635 at commit [`dd1c66d`](https://github.com/apache/spark/commit/dd1c66de16154efe62242e7cd459fc26809124a6).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/3635#issuecomment-66344031
  
      [Test build #24254 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/24254/consoleFull) for   PR 3635 at commit [`2a55bc2`](https://github.com/apache/spark/commit/2a55bc227e1247a83d7a32b647667d44fcab9952).
     * This patch merges cleanly.



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3635#discussion_r21871477
  
    --- Diff: core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
    @@ -67,11 +67,13 @@ private[spark] class DiskBlockManager(blockManager: BlockManager, conf: SparkCon
         if (subDir == null) {
           subDir = subDirs(dirId).synchronized {
             val old = subDirs(dirId)(subDirId)
    -        if (old != null) {
    +        if (old != null && old.exists()) {
               old
             } else {
               val newDir = new File(localDirs(dirId), "%02x".format(subDirId))
    -          newDir.mkdir()
    +          if (!newDir.exists() && !newDir.mkdir()) {
    --- End diff --
    
    If execution reaches this branch, doesn't that already imply that `!newDir.exists()`?  Is there a reason that this patch can't be as simple as throwing an exception if `!newDir.mkdir()`?  After this check, `subDirs` will only contain the paths of directories that exist, so you won't need to perform a redundant `&& old.exists()` check up on line 70.
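
    For reference, a rough sketch of the simpler shape being described here, reusing the names from the quoted diff and assuming `java.io.IOException` is imported at the top of the file; this is only an illustration of the suggestion, not necessarily the code that was eventually merged:

        val newDir = new File(localDirs(dirId), "%02x".format(subDirId))
        if (!newDir.mkdir()) {
          // mkdir() returned false, so the sub-directory could not be created
          throw new IOException(s"Failed to create local dir in $newDir.")
        }
        newDir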



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3635#discussion_r21879969
  
    --- Diff: core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
    @@ -67,11 +67,13 @@ private[spark] class DiskBlockManager(blockManager: BlockManager, conf: SparkCon
         if (subDir == null) {
           subDir = subDirs(dirId).synchronized {
             val old = subDirs(dirId)(subDirId)
    -        if (old != null) {
    +        if (old != null && old.exists()) {
               old
             } else {
               val newDir = new File(localDirs(dirId), "%02x".format(subDirId))
    -          newDir.mkdir()
    +          if (!newDir.exists() && !newDir.mkdir()) {
    --- End diff --
    
    So, yeah, I don't think the `!newDir.exists()` check is strictly _necessary_ here, but it's a reasonable clarification of the intent of this branch.
    
    Therefore, I'm fine merging this in its current form.



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/3635#issuecomment-67109651
  
      [Test build #24482 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/24482/consoleFull) for   PR 3635 at commit [`dd1c66d`](https://github.com/apache/spark/commit/dd1c66de16154efe62242e7cd459fc26809124a6).
     * This patch merges cleanly.



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/3635#issuecomment-66342602
  
    Jenkins, this is ok to test.



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/3635#issuecomment-66356650
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/24254/
    Test PASSed.



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by XuTingjun <gi...@git.apache.org>.
Github user XuTingjun commented on the pull request:

    https://github.com/apache/spark/pull/3635#issuecomment-67105200
  
    @JoshRosen, if users share the cluster and one of them accidentally deletes the dir, it will affect the application. So I think adding the exists() check is safer.



[GitHub] spark pull request: Add error message when making local dir unsucc...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/3635#issuecomment-66041481
  
    Can one of the admins verify this patch?



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/3635#issuecomment-67114251
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/24482/
    Test PASSed.



[GitHub] spark pull request: Add error message when making local dir unsucc...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/3635#issuecomment-66212364
  
    Mind creating a JIRA for this?



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3635#discussion_r21875371
  
    --- Diff: core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
    @@ -67,11 +67,13 @@ private[spark] class DiskBlockManager(blockManager: BlockManager, conf: SparkCon
         if (subDir == null) {
           subDir = subDirs(dirId).synchronized {
             val old = subDirs(dirId)(subDirId)
    -        if (old != null) {
    +        if (old != null && old.exists()) {
               old
             } else {
               val newDir = new File(localDirs(dirId), "%02x".format(subDirId))
    -          newDir.mkdir()
    +          if (!newDir.exists() && !newDir.mkdir()) {
    --- End diff --
    
    One thing to note is that `mkdir` returns `false` if the directory already existed, which shouldn't be an error (?). I'm not familiar enough with the code to know if this should ever be possible in normal operation. But maybe the defensive check here is fine.
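
    A small standalone illustration of that `mkdir` behaviour (the path below is hypothetical, not something from the PR):

        import java.io.File

        val dir = new File("/tmp/mkdir-demo")   // hypothetical path, assumed not to exist yet
        val first = dir.mkdir()    // true: this call created the directory
        val second = dir.mkdir()   // false: it already exists, even though nothing went wrong
        println(s"first=$first, second=$second")

    This is why `!newDir.mkdir()` on its own cannot distinguish "already existed" from "could not be created", and why the patch pairs it with the `exists()` checks.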



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/3635#issuecomment-67117844
  
    Alright, this seems fine to me.  I'm going to merge this into `master`.



[GitHub] spark pull request: [SPARK-4792] Add error message when making loc...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3635#discussion_r21527997
  
    --- Diff: core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
    @@ -67,11 +67,20 @@ private[spark] class DiskBlockManager(blockManager: BlockManager, conf: SparkCon
         if (subDir == null) {
           subDir = subDirs(dirId).synchronized {
             val old = subDirs(dirId)(subDirId)
    -        if (old != null) {
    +        if (old != null && old.exists()) {
               old
             } else {
    +          var foundLocalDir = false
    --- End diff --
    
    OK, but this variable seems pointless now.



[GitHub] spark pull request: Add error message when making local dir unsucc...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3635#discussion_r21449898
  
    --- Diff: core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
    @@ -67,11 +67,14 @@ private[spark] class DiskBlockManager(blockManager: BlockManager, conf: SparkCon
         if (subDir == null) {
           subDir = subDirs(dirId).synchronized {
             val old = subDirs(dirId)(subDirId)
    -        if (old != null) {
    +        if (old != null && old.exists()) {
               old
             } else {
               val newDir = new File(localDirs(dirId), "%02x".format(subDirId))
    -          newDir.mkdir()
    +          val foundLocalDir = newDir.mkdir()
    +           if (!foundLocalDir) {
    --- End diff --
    
    The indent has one too many spaces. The message should probably be a warning. It says "ignoring this directory", but the directory doesn't actually seem to be ignored? You also changed the semantics of the condition, to replace a value that was a non-existent dir. That seems reasonable, but it can now replace it with a directory that couldn't be created for some reason. Is this not an exception condition?

