Posted to dev@ranger.apache.org by "Henning Kropp (JIRA)" <ji...@apache.org> on 2016/01/25 09:39:39 UTC

[jira] [Updated] (RANGER-820) RangerHiveAuthorizer Ignores HDFS Policies for Creation of Objects

     [ https://issues.apache.org/jira/browse/RANGER-820?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Henning Kropp updated RANGER-820:
---------------------------------
    Description: 
RangerHiveAuthorizer uses the method {{isURIAccessAllowed}} during the creation of new objects (when {{isEmpty(inputHObjs)}} is true), which relies solely on {{FileUtil}} and {{FileStatus}} to check whether or not the user has the required filesystem rights on the directory hierarchy.

If, following best practices, a folder is for example owned by the hdfs user and only the hdfs user is granted RWX access, it is impossible for any other user to create an external table in that folder through HS2, even if Ranger policies grant that user the required access privileges.
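
The effect of such a check can be illustrated with a short sketch. The code below is not the actual Ranger implementation; it is a minimal approximation, assuming only the standard Hadoop {{FileSystem}} API, of a decision taken purely from {{FileStatus}} permission bits (user, group, and path names are made up). Because only the POSIX bits are inspected, Ranger HDFS policies on the path can never be taken into account:

{code}
// Illustrative sketch only -- not the actual Ranger code. It mimics a check that
// derives access purely from HDFS permission bits (owner/group/other), which is
// why Ranger HDFS policies on the path are never consulted.
import java.io.IOException;
import java.util.Arrays;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsAction;
import org.apache.hadoop.fs.permission.FsPermission;

public class PosixOnlyUriCheck {

    /** Returns true only if the POSIX bits on the path grant 'action' to 'user'. */
    static boolean isActionPermitted(FileSystem fs, Path path, String user,
                                     String[] groups, FsAction action) throws IOException {
        FileStatus status = fs.getFileStatus(path);
        FsPermission perm = status.getPermission();

        if (user.equals(status.getOwner())) {
            return perm.getUserAction().implies(action);
        }
        if (Arrays.asList(groups).contains(status.getGroup())) {
            return perm.getGroupAction().implies(action);
        }
        return perm.getOtherAction().implies(action);
        // A directory owned by 'hdfs' with mode 700 fails here for every other
        // user, regardless of any Ranger policy granting that user access.
    }

    public static void main(String[] args) throws IOException {
        FileSystem fs = FileSystem.get(new Configuration());
        // Path, user and group are hypothetical placeholders.
        boolean ok = isActionPermitted(fs, new Path("/data/external/sales"),
                                       "analyst", new String[]{"analysts"}, FsAction.READ);
        System.out.println("POSIX-bits check says READ allowed: " + ok);
    }
}
{code}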

*Resulting exception*:
{code}
Caused by: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: user [user] does not have [READ] privilege on [hdfs://path/...]
at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.checkPrivileges(RangerHiveAuthorizer.java:249)
at org.apache.hadoop.hive.ql.Driver.doAuthorizationV2(Driver.java:779)
at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:574)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:468)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:308)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1122)
at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1116)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:110)
... 15 more
{code}

*Workaround*: Use Hive CLI
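
For reference, a minimal reproduction through HS2 could look like the following JDBC sketch (host, credentials, table name, and HDFS location are hypothetical placeholders). With the target directory owned by hdfs and closed to other users, the statement fails at compile time with the HiveAccessControlException shown above, despite matching Ranger policies:

{code}
// Minimal reproduction sketch through HiveServer2 (JDBC). Host, user, password,
// database, table name and HDFS location are hypothetical placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class ExternalTableRepro {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String url = "jdbc:hive2://hs2-host:10000/default";

        try (Connection conn = DriverManager.getConnection(url, "user", "");
             Statement stmt = conn.createStatement()) {
            // When the target directory is owned by hdfs (e.g. mode 700), this DDL
            // is rejected during compilation with HiveAccessControlException,
            // even though Ranger policies grant the user access to the path.
            stmt.execute("CREATE EXTERNAL TABLE ext_demo (id INT) " +
                         "LOCATION 'hdfs:///data/external/ext_demo'");
        }
    }
}
{code}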

  was:
RangerHiveAuthorizer uses the method {{isURIAccessAllowed}} during the creation of new objects (when {{isEmpty(inputHObjs)}} is true), which relies solely on {{FileUtil}} and {{FileStatus}} to check whether or not the user has the required filesystem rights on the directory hierarchy.

If, following best practices, a folder is for example owned by the hdfs user and only the hdfs user is granted RWX access, it is impossible for any other user to create an external table in that folder through HS2, even if Ranger policies grant that user the required access privileges.

Resulting exception:
{{Caused by: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: user [user] does not have [READ] privilege on [hdfs://path/...]
at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.checkPrivileges(RangerHiveAuthorizer.java:249)
at org.apache.hadoop.hive.ql.Driver.doAuthorizationV2(Driver.java:779)
at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:574)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:468)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:308)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1122)
at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1116)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:110)
... 15 more}}

Workaround: Use Hive CLI


> RangerHiveAuthorizer Ignores HDFS Policies for Creation of Objects
> ------------------------------------------------------------------
>
>                 Key: RANGER-820
>                 URL: https://issues.apache.org/jira/browse/RANGER-820
>             Project: Ranger
>          Issue Type: Bug
>          Components: plugins
>    Affects Versions: ranger
>         Environment: HiveServer2
>            Reporter: Henning Kropp
>
> RangerHiveAuthorizer uses the method {{isURIAccessAllowed}} during the creation of new objects (when {{isEmpty(inputHObjs)}} is true), which relies solely on {{FileUtil}} and {{FileStatus}} to check whether or not the user has the required filesystem rights on the directory hierarchy.
> If, following best practices, a folder is for example owned by the hdfs user and only the hdfs user is granted RWX access, it is impossible for any other user to create an external table in that folder through HS2, even if Ranger policies grant that user the required access privileges.
> *Resulting exception*:
> {code}
> Caused by: org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException: Permission denied: user [user] does not have [READ] privilege on [hdfs://path/...]
> at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.checkPrivileges(RangerHiveAuthorizer.java:249)
> at org.apache.hadoop.hive.ql.Driver.doAuthorizationV2(Driver.java:779)
> at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:574)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:468)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:308)
> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1122)
> at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1116)
> at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:110)
> ... 15 more
> {code}
> *Workaround*: Use Hive CLI



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)