Posted to user@hive.apache.org by Kaidi Zhao <kz...@salesforce.com> on 2019/04/17 05:02:53 UTC

HS2: Permission denied for my own table?

Hello!

Did I miss anything here, or is this a known issue? Hive 1.2.1, Hadoop
2.7.x, Kerberos, impersonation.

Using the Hive CLI, I created a Hive database and a table, and I can select
from the table correctly.
In HDFS, I changed the table folder's permissions to 711. From the Hive CLI
I can still select from the table.
However, when using the Beeline client (which talks to HS2, I believe), it
complains that it can't read the table folder in HDFS, with something like:

Error: Error while compiling statement: FAILED: SemanticException Unable to fetch table fact_app_logs. java.security.AccessControlException: Permission denied: user=hive, access=READ, inode="/data/mydb.db/my_table":myuser:mygroup:drwxr-x--x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:307)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:220)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1752)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1736)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1710)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8220)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:1932)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1455)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2218)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2214)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1760)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2212)
(state=42000,code=40000)

Note that, according to the log, it tries to read the table's folder as
user "hive" (instead of my own user "myuser"); the folder is readable only
by its owner, myuser.
Again, using the Hive CLI I can read the table, but using Beeline I can't.
If I change the folder's permissions to 755, then it works.
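For context, HDFS applies POSIX-style mode bits here. A minimal sketch in plain Python (helper name and simplifications are mine; real HDFS also considers the superuser and ACLs) of why a 711 directory denies READ to user "hive" while 755 allows it:

```python
# Minimal model of the permission check behind
# "Permission denied: user=hive, access=READ" (illustrative only).

READ, WRITE, EXECUTE = 4, 2, 1

def is_allowed(mode, owner, group, user, user_groups, access):
    """Return True if `user` gets `access` on an inode with octal `mode`."""
    if user == owner:
        bits = (mode >> 6) & 7       # owner bits
    elif group in user_groups:
        bits = (mode >> 3) & 7       # group bits
    else:
        bits = mode & 7              # "other" bits
    return bool(bits & access)

# Directory owned by myuser:mygroup with mode 711 (rwx--x--x):
# user "hive" falls into "other" and gets only EXECUTE (traversal), not READ.
print(is_allowed(0o711, "myuser", "mygroup", "hive", {"hive"}, READ))     # False
print(is_allowed(0o711, "myuser", "mygroup", "hive", {"hive"}, EXECUTE))  # True
# With 755 (rwxr-xr-x), the "other" bits include READ, so the check passes.
print(is_allowed(0o755, "myuser", "mygroup", "hive", {"hive"}, READ))     # True
```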

Why does Beeline / HS2 need to read the table's folder as "hive"?

Thanks in advance.

Kaidi

Re: HS2: Permission denied for my own table?

Posted by Kaidi Zhao <kz...@salesforce.com>.
I checked that the impersonation setting is correct. I'm also not using SQL
authorization.
Side note: it looks like HS2 really wants to read the table's folder as
user "hive" (just the folder) before it does anything with the actual
files. In the setup described above, I can query the table from HS2 only if
I give world-readable permission (which includes user "hive") to the folder
alone; the actual files under the folder remain readable only by me, and
HS2 does not complain.
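This observation is consistent with a compile-time check on the directory only, while the data files are read as the impersonated end user. A small self-contained sketch using an illustrative POSIX-style permission check (not Hive's actual code):

```python
# Illustrative model of the observation above: HS2 (running as "hive")
# checks the table *directory* at compile time, while the data *files*
# are read as the impersonated end user.

READ = 4

def allowed(mode, owner, user, access):
    """Simplified owner/other check; `access` is a permission bit (4 = READ)."""
    bits = (mode >> 6) & 7 if user == owner else mode & 7
    return bool(bits & access)

# Directory made world-readable (e.g. 755): the compile-time check as "hive" passes.
print(allowed(0o755, "myuser", "hive", READ))    # True
# Files left owner-only (e.g. 600): "hive" could not read them directly...
print(allowed(0o600, "myuser", "hive", READ))    # False
# ...but with impersonation the query reads them as "myuser", so it still works.
print(allowed(0o600, "myuser", "myuser", READ))  # True
```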





Re: HS2: Permission denied for my own table?

Posted by Dan Horne <da...@redbone.co.nz>.
Are you using SQL Authorisation? If you create tables using the Hive CLI,
you won't be able to select from the table over a connection to the Hive
server.


Re: HS2: Permission denied for my own table?

Posted by Alan Gates <al...@gmail.com>.
See
https://cwiki.apache.org/confluence/display/Hive/Setting+up+HiveServer2#SettingUpHiveServer2-Impersonation

Alan.
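[For reference, the page above covers HS2 impersonation, which is controlled by `hive.server2.enable.doAs` in hive-site.xml together with proxy-user entries in Hadoop's core-site.xml. A typical setup looks roughly like the following; the wildcard values are permissive examples, and production clusters usually restrict the hosts and groups:]

```xml
<!-- hive-site.xml: make HS2 run queries as the connected end user -->
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>

<!-- core-site.xml: allow the "hive" service user to impersonate end users -->
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
```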
