Posted to issues@hive.apache.org by "Peter Vary (JIRA)" <ji...@apache.org> on 2019/01/07 14:34:00 UTC

[jira] [Updated] (HIVE-20914) MRScratchDir permission denied when "hive.server2.enable.doAs", "hive.exec.submitviachild" are set to "true" and impersonated/proxy user is used

     [ https://issues.apache.org/jira/browse/HIVE-20914?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Peter Vary updated HIVE-20914:
------------------------------
       Resolution: Fixed
    Fix Version/s: 4.0.0
           Status: Resolved  (was: Patch Available)

Pushed to master.

Thanks [~dkuzmenko] for the patch, and [~szita] for the review!

> MRScratchDir permission denied when "hive.server2.enable.doAs", "hive.exec.submitviachild" are set to "true" and impersonated/proxy user is used
> ------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-20914
>                 URL: https://issues.apache.org/jira/browse/HIVE-20914
>             Project: Hive
>          Issue Type: Bug
>          Components: HiveServer2
>            Reporter: Denys Kuzmenko
>            Assignee: Denys Kuzmenko
>            Priority: Major
>             Fix For: 4.0.0
>
>         Attachments: HIVE-20914.1.patch, HIVE-20914.10.patch, HIVE-20914.2.patch, HIVE-20914.3.patch, HIVE-20914.4.patch, HIVE-20914.5.patch, HIVE-20914.6.patch, HIVE-20914.7.patch, HIVE-20914.8.patch, HIVE-20914.9.patch
>
>
> The above issue can be reproduced on a non-Kerberos cluster with the following steps:
> 1. Set "hive.exec.submitviachild" to "true" (with "hive.server2.enable.doAs" also set to "true", as in the title; see the configuration sketch after these steps).
> 2. Run a count query as a user other than "hive", e.g.:
> {code}beeline -u 'jdbc:hive2://localhost:10000' -n hdfs{code}
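> For reference, the corresponding hive-site.xml entries would look roughly like the sketch below (property names taken from the issue title; the values are what this repro assumes, not a verified cluster configuration):
> {code:xml}
> <property>
>   <name>hive.server2.enable.doAs</name>
>   <value>true</value>
> </property>
> <property>
>   <name>hive.exec.submitviachild</name>
>   <value>true</value>
> </property>
> {code}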
> There is no issue when the same query is executed as the "hive" user; with any other user the query fails with the exception below.
> {code:java}
> Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=EXECUTE, inode="/tmp/hive/hdfs":hdfs:supergroup:drwx------
> 	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
> 	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
> 	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:201)
> 	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:154)
> 	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
> 	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3877)
> 	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3860)
> 	at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:3847)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkTraverse(FSNamesystem.java:6822)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4551)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4529)
> 	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4502)
> 	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:884)
> 	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:328)
> 	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:641)
> 	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
> 	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
> 	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
> 	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
> 	at org.apache.hadoop.hive.ql.Context.getScratchDir(Context.java:285)
> 	at org.apache.hadoop.hive.ql.Context.getMRScratchDir(Context.java:328)
> 	at org.apache.hadoop.hive.ql.Context.getMRTmpPath(Context.java:444)
> 	at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:243)
> 	at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.main(ExecDriver.java:771)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
> 	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> {code}
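> The inode in the trace is the proxy user's HDFS scratch directory. The permissions being checked can be inspected with a plain HDFS listing (a sketch; the expected output line mirrors what the trace reports, with size and timestamp elided):
> {code}
> hdfs dfs -ls -d /tmp/hive/hdfs
> # drwx------   - hdfs supergroup   ...   /tmp/hive/hdfs
> {code}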



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)