Posted to issues@spark.apache.org by "Pralabh Kumar (Jira)" <ji...@apache.org> on 2023/04/22 09:41:00 UTC

[jira] [Created] (SPARK-43235) ClientDistributedCacheManager doesn't set the LocalResourceVisibility.PRIVATE if isPublic throws exception

Pralabh Kumar created SPARK-43235:
-------------------------------------

             Summary: ClientDistributedCacheManager doesn't set the LocalResourceVisibility.PRIVATE if isPublic throws exception
                 Key: SPARK-43235
                 URL: https://issues.apache.org/jira/browse/SPARK-43235
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.4.0
            Reporter: Pralabh Kumar


Hi Spark Team.

Currently, the *ClientDistributedCacheManager* *getVisibility* method checks whether a resource's visibility should be set to public or private.

In order to set *LocalResourceVisibility.PUBLIC*, isPublic checks the permissions of all ancestor directories of the resource, walking up to the root to verify that every parent is executable (ancestorsHaveExecutePermissions).

checkPermissionOfOther calls getFileStatus to obtain the FileStatus and inspect its permissions.

If getFileStatus throws an exception, spark-submit fails; the visibility is never set to PRIVATE:

if (isPublic(conf, uri, statCache)) {
  LocalResourceVisibility.PUBLIC
} else {
  LocalResourceVisibility.PRIVATE
}

Generally, if the user doesn't have permission to inspect the root folder, which is common on cloud file systems such as GCS (at the bucket level), the method throws an IOException ("Error accessing Bucket").

 

*Ideally, if isPublic throws an error, meaning Spark is unable to determine the execute permissions of all parent directories, it should set LocalResourceVisibility.PRIVATE. Instead, the exception currently propagates out of isPublic and spark-submit fails.*
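A minimal sketch of the proposed behaviour, not the actual Spark patch: the stub resolveVisibility below stands in for the getVisibility logic, with isPublic passed as a function that may throw IOException (as ClientDistributedCacheManager.isPublic can when ancestor permissions are unreadable, e.g. on a GCS bucket). The enum-like objects are simplified stand-ins for YARN's LocalResourceVisibility.

```scala
import java.io.IOException

object VisibilitySketch {
  // Simplified stand-ins for org.apache.hadoop.yarn.api.records.LocalResourceVisibility.
  sealed trait LocalResourceVisibility
  case object PUBLIC extends LocalResourceVisibility
  case object PRIVATE extends LocalResourceVisibility

  // Proposed behaviour: if the public-visibility check itself fails,
  // fall back to PRIVATE instead of letting spark-submit fail.
  def resolveVisibility(isPublic: () => Boolean): LocalResourceVisibility =
    try {
      if (isPublic()) PUBLIC else PRIVATE
    } catch {
      // Cannot determine the ancestors' permissions: be conservative.
      case _: IOException => PRIVATE
    }
}
```

PRIVATE is the safe fallback here because a private resource is always localized for the submitting user, whereas wrongly marking a resource PUBLIC could expose it; so treating "cannot verify" as "not public" loses only cache sharing, not correctness.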

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org