Posted to issues@spark.apache.org by "Reza Safi (JIRA)" <ji...@apache.org> on 2018/12/07 22:31:00 UTC

[jira] [Commented] (SPARK-19526) Spark should raise an exception when it tries to read a Hive view but it doesn't have read access on the corresponding table(s)

    [ https://issues.apache.org/jira/browse/SPARK-19526?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16713368#comment-16713368 ] 

Reza Safi commented on SPARK-19526:
-----------------------------------

It seems that this can be resolved, since we can't reproduce the issue. Spark gives an error message if the user doesn't have proper access to the underlying table(s) of a view; it doesn't just return null results. Thank you [~attilapiros] and [~vanzin] for verifying this.
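
For anyone who wants to verify this locally, here is a minimal sketch (the database and view names are hypothetical; it assumes a Hive view defined over a table the current user has no read access to):

    import org.apache.spark.sql.SparkSession

    // Assumes a Hive view db.sales_view built on a table the current
    // user cannot read (both names are hypothetical).
    val spark = SparkSession.builder()
      .appName("hive-view-access-check")
      .enableHiveSupport()
      .getOrCreate()

    // With the behavior described above, this surfaces an access error
    // instead of silently returning an empty result set.
    spark.sql("SELECT * FROM db.sales_view").show()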

> Spark should raise an exception when it tries to read a Hive view but it doesn't have read access on the corresponding table(s)
> -------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-19526
>                 URL: https://issues.apache.org/jira/browse/SPARK-19526
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.6.4, 2.0.3, 2.2.0, 2.3.0
>            Reporter: Reza Safi
>            Priority: Major
>
> Spark sees a Hive view as a set of HDFS "files". To read anything from a Hive view, Spark therefore needs access to all of the files that belong to the table(s) the view queries. In other words, a Spark user cannot be granted fine-grained permissions at the level of Hive columns or records.
> Consider a Spark job that contains a SQL query reading a Hive view. Currently the job finishes successfully even if the user running it doesn't have proper read access to the tables the view is built on top of; it just returns an empty result set. This is confusing for users, since the job finishes without any exception or error.
> Spark should raise an exception such as AccessDenied when it runs a Hive view query and its user doesn't have proper permissions on the tables the view is created on top of.
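
To illustrate the file-level access the description above refers to: because Spark resolves the view into a scan of the underlying table's files, the permission that matters is on the table's HDFS location rather than on the view itself. A minimal sketch, assuming a hypothetical warehouse path like /user/hive/warehouse/sales, that checks that access with the Hadoop FileSystem API:

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}
    import org.apache.hadoop.fs.permission.FsAction

    // Hypothetical HDFS location of the table that the Hive view is built on.
    val tableLocation = new Path("/user/hive/warehouse/sales")
    val fs = FileSystem.get(new Configuration())

    // Throws org.apache.hadoop.security.AccessControlException if the current
    // user cannot read the table's files, which is the access Spark actually
    // needs once it expands the view into a scan of the underlying table.
    fs.access(tableLocation, FsAction.READ)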



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org