Posted to issues@spark.apache.org by "Peter Toth (Jira)" <ji...@apache.org> on 2019/10/02 20:16:00 UTC
[jira] [Comment Edited] (SPARK-29078) Spark shell fails if read
permission is not granted to hive warehouse directory
[ https://issues.apache.org/jira/browse/SPARK-29078?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16943127#comment-16943127 ]
Peter Toth edited comment on SPARK-29078 at 10/2/19 8:15 PM:
-------------------------------------------------------------
I don't think there should be other databases under the {{/apps/hive/warehouse}} directory if the {{default}} database points to {{/apps/hive/warehouse}}. That way we could avoid this issue.
was (Author: petertoth):
I don't think there should be other databases under the {{/apps/hive/warehouse}} directory if the {{default}} database points to {{/apps/hive/warehouse}}.
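For illustration, one way to follow that layout is to create non-default databases with explicit locations outside the warehouse root, so nothing but {{default}} lives under {{/apps/hive/warehouse}}. A minimal spark-shell sketch (the database name and HDFS path are made up for this example):
{code:scala}
// Illustrative only: keep per-user databases outside the shared warehouse
// root so that listing /apps/hive/warehouse is never needed for them.
// "user_db" and the path are hypothetical; `spark` is the spark-shell session.
spark.sql("CREATE DATABASE IF NOT EXISTS user_db LOCATION '/user/someuser/user_db.db'")
{code}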
> Spark shell fails if read permission is not granted to hive warehouse directory
> -------------------------------------------------------------------------------
>
> Key: SPARK-29078
> URL: https://issues.apache.org/jira/browse/SPARK-29078
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: Mihaly Toth
> Priority: Major
>
> Similarly to SPARK-20256, when the {{GlobalTempViewManager}} is created in {{SharedSessionState}}, it is checked that no database exists with the same name as the global temp database (the name is configurable via {{spark.sql.globalTempDatabase}}), because that is a special database which should not exist in the metastore. At the moment this check requires read permission on the warehouse directory, which in turn would allow listing all databases of all users.
> When such read access is not granted for security reasons, the resulting access violation exception should be ignored during this initial validation.
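> A minimal sketch of the proposed behavior (simplified and hypothetical, not Spark's actual code; the method name and the thrown exception type are assumptions, while {{ExternalCatalog.databaseExists}} and Hadoop's {{AccessControlException}} are real APIs):
> {code:scala}
> import org.apache.hadoop.security.AccessControlException
> import org.apache.spark.sql.catalyst.catalog.ExternalCatalog
>
> // Sketch: validate that the reserved global temp database name is free,
> // but tolerate a missing read permission on the warehouse directory.
> def validateGlobalTempDB(catalog: ExternalCatalog, globalTempDB: String): Unit = {
>   val reservedNameTaken =
>     try catalog.databaseExists(globalTempDB)
>     catch {
>       case _: AccessControlException =>
>         // No read access to the warehouse dir: assume the name is free
>         // instead of failing spark-shell startup.
>         false
>     }
>   if (reservedNameTaken) {
>     throw new IllegalStateException(
>       s"$globalTempDB is reserved for global temporary views")
>   }
> }
> {code}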
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org