Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/06/30 12:01:52 UTC

[GitHub] [spark] attilapiros commented on pull request #28940: [SPARK-32121][SHUFFLE][TEST] Fix ExternalShuffleBlockResolverSuite failed on Windows

attilapiros commented on pull request #28940:
URL: https://github.com/apache/spark/pull/28940#issuecomment-651747960


   What about the following?
   
   Since `createNormalizedInternedPathname` is only used in production here:
   https://github.com/apache/spark/blob/50a742da1c48bcfb6f69b319c9a0142801c959c6/common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ExecutorDiskUtils.java#L48-L49
    
   what about creating the file with the plain `File` constructor:
   https://github.com/openjdk/jdk/blob/9a9add8825a040565051a09010b29b099c2e7d49/jdk/src/share/classes/java/io/File.java#L275-L281:
   ```java
       public File(String pathname) {
           if (pathname == null) {
               throw new NullPointerException();
           }
           this.path = fs.normalize(pathname);
           this.prefixLength = fs.prefixLength(this.path);
       }
   ```
   
   Then read the path back via `File#getPath` and build the `File` again from that path with `String#intern` called on it. As we are already doing the path transformation twice, this new solution is not worse than the current implementation, but at least it will be a perfect match regarding the normalized path. The first constructed `File` will become garbage immediately (this might be a disadvantage). Moreover, we can get rid of the now unnecessary tests.
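   To make the idea concrete, here is a minimal sketch of what I mean (the helper name `createNormalizedInternedFile` is hypothetical, just for illustration; the real change would live in `ExecutorDiskUtils`):
   
   ```java
   import java.io.File;
   
   public class NormalizedInternedFileSketch {
   
       // Hypothetical helper: let java.io.File's constructor do the
       // platform-specific normalization, then intern the normalized
       // path String so equal paths share one instance.
       static File createNormalizedInternedFile(String parent, String child) {
           // This first File exists only to normalize the path; it is
           // immediately garbage (the disadvantage mentioned above).
           String normalized = new File(parent, child).getPath();
           return new File(normalized.intern());
       }
   
       public static void main(String[] args) {
           // Differently denormalized inputs end up with the same path.
           File f1 = createNormalizedInternedFile("/tmp//data", "blocks");
           File f2 = createNormalizedInternedFile("/tmp/data/", "blocks");
           System.out.println(f1.getPath().equals(f2.getPath()));
       }
   }
   ```
   
   Since the normalization is done by `FileSystem#normalize` itself, the result matches `File`'s own notion of a normalized path exactly on every platform, Windows included, instead of us re-implementing it.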
   
   @Ngone51 @HyukjinKwon what's your opinion?
   
   

