Posted to dev@ranger.apache.org by "chuanjie.duan (JIRA)" <ji...@apache.org> on 2017/10/31 06:14:00 UTC
[jira] [Comment Edited] (RANGER-1865) hive plugin alter table add
partition failed HiveAccessControlException Permission denied: user does
not have [READ] privilege on location
[ https://issues.apache.org/jira/browse/RANGER-1865?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16226212#comment-16226212 ]
chuanjie.duan edited comment on RANGER-1865 at 10/31/17 6:13 AM:
-----------------------------------------------------------------
It is an external table, but I don't think this is a permission problem: the plugin has not done authentication (I mean Kerberos login). Without logging in it cannot reach the NameNode, and without reaching the NameNode it cannot check permissions, so the error pops up (ConnectException: Call From hostname/ipaddress to hiveserver host:9000 failed on connection exception).
In my case, I use user algo to add the partition (the Hive database and table were both created by algo) and the location is hdfs://user/algo/tablename (user algo's home dir), so user algo should have permission.
public static FileSystem get(final URI uri, final Configuration conf,
    final String user) throws IOException, InterruptedException {
  String ticketCachePath =
      conf.get(CommonConfigurationKeys.KERBEROS_TICKET_CACHE_PATH);
  UserGroupInformation ugi =
      UserGroupInformation.getBestUGI(ticketCachePath, user);
  ....
}
Here ticketCachePath is null:
public static UserGroupInformation getBestUGI(
    String ticketCachePath, String user) throws IOException {
  if (ticketCachePath != null) {
    return getUGIFromTicketCache(ticketCachePath, user);
  } else if (user == null) {
    return getCurrentUser();
  } else {
    return createRemoteUser(user);
  }
}
user is "algo" and ticketCachePath is null, so createRemoteUser is called:
public static UserGroupInformation createRemoteUser(String user) {
  return createRemoteUser(user, AuthMethod.SIMPLE);
}
SIMPLE is the wrong auth method here, because the cluster has Kerberos enabled.
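To make the call chain above concrete, here is a minimal, dependency-free sketch of the getBestUGI branching. The class, method bodies, and return strings are simplified stand-ins for Hadoop's real UserGroupInformation API, not the actual implementation:

```java
// Illustrative sketch of the UserGroupInformation.getBestUGI decision order.
// Strings stand in for the real UGI objects; AuthMethod mirrors the enum name only.
public class BestUgiSketch {
    enum AuthMethod { SIMPLE, KERBEROS }

    // Decision order shown in the snippets above: ticket cache first,
    // then current user, then a remote user created with SIMPLE auth.
    static String getBestUGI(String ticketCachePath, String user) {
        if (ticketCachePath != null) {
            return "ugi-from-ticket-cache:" + user;
        } else if (user == null) {
            return "current-user";
        } else {
            // This is the branch hit in the bug report: user = "algo",
            // ticketCachePath = null, so auth falls back to SIMPLE
            // even though the cluster expects Kerberos.
            return "remote-user:" + user + ":" + AuthMethod.SIMPLE;
        }
    }

    public static void main(String[] args) {
        // Reproduces the reported case: no ticket cache, user "algo".
        System.out.println(getBestUGI(null, "algo"));
    }
}
```

With ticketCachePath null and a non-null user, the SIMPLE branch is unavoidable, which is why the plugin never attempts a Kerberos login before calling the NameNode.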
was (Author: chuanjie.duan):
It is an external table, but I don't think it is about permission; it has not done authentication (I mean login). Without login it cannot access the NameNode, so the error pops up (ConnectException: Call From hostname/ipaddress to hiveserver host:9000 failed on connection exception). Furthermore, my Hive table partition's location is the user's home dir, so the user should have the right HDFS permission.
> hive plugin alter table add partition failed HiveAccessControlException Permission denied: user does not have [READ] privilege on location
> ------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: RANGER-1865
> URL: https://issues.apache.org/jira/browse/RANGER-1865
> Project: Ranger
> Issue Type: Bug
> Components: plugins
> Affects Versions: 0.6.3
> Reporter: chuanjie.duan
> Priority: Critical
> Labels: hive-agent
> Attachments: RANGER-1865.patch
>
>
> Hive executes the SQL: alter table tablename add if not exists partition(yyyymmdd='20170911',ds='rcc_02') location 'hdfs://xxxx/yyyymmdd=20170911/ds=rcc_02'
> Client Log:
> org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: HiveAccessControlException Permission denied: user [username] does not have [READ] privilege on [hdfs://xxxx/yyyymmdd=20170911/ds=rcc_02]
> Hiveserver Log:
> 2017-10-27 16:53:26,929 ERROR [HiveServer2-Handler-Pool: Thread-43]: authorizer.RangerHiveAuthorizer (RangerHiveAuthorizer.java:isURIAccessAllowed(1034)) - Error getting permissions for hdfs://xxxx/yyyymmdd=20170911/ds=rcc_02
> java.net.ConnectException: Call From hostname/ipaddress to hiveserver host:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
> at org.apache.hadoop.ipc.Client.call(Client.java:1480)
> at org.apache.hadoop.ipc.Client.call(Client.java:1407)
> at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
> at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
> at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
> at sun.reflect.GeneratedMethodAccessor10.invoke(Unknown Source)
> Cause:
> Hive security has Kerberos enabled; the Hive plugin should do Kerberos authentication first before accessing HDFS.
>
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)