Posted to dev@phoenix.apache.org by "Ankit Singhal (Jira)" <ji...@apache.org> on 2020/01/23 20:09:00 UTC

[jira] [Updated] (PHOENIX-5691) create index is failing when phoenix acls enabled and ranger is enabled

     [ https://issues.apache.org/jira/browse/PHOENIX-5691?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ankit Singhal updated PHOENIX-5691:
-----------------------------------
    Summary: create index is failing when phoenix acls enabled and ranger is enabled  (was: create index is failing when phoenix acls enabled when ranger is enabled)

> create index is failing when phoenix acls enabled and ranger is enabled
> -----------------------------------------------------------------------
>
>                 Key: PHOENIX-5691
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-5691
>             Project: Phoenix
>          Issue Type: Bug
>            Reporter: Rajeshbabu Chintaguntla
>            Assignee: Rajeshbabu Chintaguntla
>            Priority: Major
>             Fix For: 5.1.0, 4.16.0
>
>         Attachments: PHOENIX-5691.patch
>
>
> create index is failing with the following exception when Phoenix ACLs are enabled and Ranger is enabled.
> {noformat}
>   <property>
>     <name>phoenix.acls.enabled</name>
>     <value>true</value>
>   </property>
> {noformat}
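>
> The summary also calls out Ranger: the failing path is PhoenixAccessController.getUserPermsFromUserDefinedAccessController, which is only taken when a non-default access controller such as Ranger's HBase coprocessor is registered. For context, a typical hbase-site.xml for that setup might look like the sketch below; the coprocessor keys are standard HBase settings, and the Ranger class name is the one usually shipped with the Ranger HBase plugin, shown here as an assumption rather than taken from this report.
> {noformat}
>   <property>
>     <name>hbase.coprocessor.master.classes</name>
>     <value>org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor</value>
>   </property>
>   <property>
>     <name>hbase.coprocessor.region.classes</name>
>     <value>org.apache.ranger.authorization.hbase.RangerAuthorizationCoprocessor</value>
>   </property>
> {noformat}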
> {noformat}
> Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.ClassCastException: org.apache.hadoop.hbase.ipc.HBaseRpcControllerImpl cannot be cast to com.google.protobuf.RpcController
> 	at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:103)
> 	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:603)
> 	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16537)
> 	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8305)
> 	at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2497)
> 	at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2479)
> 	at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42286)
> 	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:413)
> 	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:133)
> 	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:338)
> 	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:318)
> Caused by: java.lang.ClassCastException: org.apache.hadoop.hbase.ipc.HBaseRpcControllerImpl cannot be cast to com.google.protobuf.RpcController
> 	at org.apache.phoenix.coprocessor.PhoenixAccessController$3.getUserPermsFromUserDefinedAccessController(PhoenixAccessController.java:448)
> 	at org.apache.phoenix.coprocessor.PhoenixAccessController$3.run(PhoenixAccessController.java:431)
> 	at org.apache.phoenix.coprocessor.PhoenixAccessController$3.run(PhoenixAccessController.java:418)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1876)
> 	at org.apache.hadoop.security.SecurityUtil.doAsUser(SecurityUtil.java:515)
> 	at org.apache.hadoop.security.SecurityUtil.doAsLoginUser(SecurityUtil.java:496)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.hadoop.hbase.util.Methods.call(Methods.java:40)
> 	at org.apache.hadoop.hbase.security.User.runAsLoginUser(User.java:192)
> 	at org.apache.phoenix.coprocessor.PhoenixAccessController.getUserPermissions(PhoenixAccessController.java:418)
> 	at org.apache.phoenix.coprocessor.PhoenixAccessController.requireAccess(PhoenixAccessController.java:498)
> 	at org.apache.phoenix.coprocessor.PhoenixAccessController.preGetTable(PhoenixAccessController.java:116)
> 	at org.apache.phoenix.coprocessor.PhoenixMetaDataCoprocessorHost$1.call(PhoenixMetaDataCoprocessorHost.java:157)
> 	at org.apache.phoenix.coprocessor.PhoenixMetaDataCoprocessorHost$1.call(PhoenixMetaDataCoprocessorHost.java:154)
> 	at org.apache.phoenix.coprocessor.PhoenixMetaDataCoprocessorHost$PhoenixObserverOperation.callObserver(PhoenixMetaDataCoprocessorHost.java:87)
> 	at org.apache.phoenix.coprocessor.PhoenixMetaDataCoprocessorHost.execOperation(PhoenixMetaDataCoprocessorHost.java:107)
> 	at org.apache.phoenix.coprocessor.PhoenixMetaDataCoprocessorHost.preGetTable(PhoenixMetaDataCoprocessorHost.java:154)
> 	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:568)
> 	... 9 more
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at org.apache.hadoop.hbase.ipc.RemoteWithExtrasException.instantiateException(RemoteWithExtrasException.java:99)
> 	at org.apache.hadoop.hbase.ipc.RemoteWithExtrasException.unwrapRemoteException(RemoteWithExtrasException.java:89)
> 	at org.apache.hadoop.hbase.protobuf.ProtobufUtil.makeIOExceptionOfException(ProtobufUtil.java:282)
> 	at org.apache.hadoop.hbase.protobuf.ProtobufUtil.handleRemoteException(ProtobufUtil.java:269)
> 	at org.apache.hadoop.hbase.client.RegionServerCallable.call(RegionServerCallable.java:129)
> 	at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:107)
> 	at org.apache.hadoop.hbase.client.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:91)
> 	at org.apache.hadoop.hbase.client.SyncCoprocessorRpcChannel.callMethod(SyncCoprocessorRpcChannel.java:52)
> 	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.getTable(MetaDataProtos.java:16724)
> 	at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1585)
> 	at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1572)
> 	at org.apache.hadoop.hbase.client.HTable$12.call(HTable.java:979)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:748)
> {noformat}
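>
> The ClassCastException in the nested "Caused by" is the actual failure: on HBase 2.x the coprocessor is handed an HBaseRpcControllerImpl, which implements HBase's relocated (shaded) protobuf RpcController rather than com.google.protobuf.RpcController, so the unconditional cast at PhoenixAccessController.java:448 cannot succeed. The sketch below is illustrative only and is not the attached patch: it shows a guarded cast (hypothetical helper name) that makes the type mismatch explicit instead of failing inside the stub call.
> {noformat}
> import com.google.protobuf.RpcController;
>
> // Illustration only, not the PHOENIX-5691 patch. The trace fails because the
> // controller HBase passes to the coprocessor implements HBase's relocated
> // (shaded) RpcController type, so casting it to com.google.protobuf.RpcController
> // throws ClassCastException. A guarded cast surfaces the mismatch explicitly.
> public final class ControllerCastSketch {   // hypothetical class name
>     private ControllerCastSketch() {}
>
>     /** Returns controller as a non-shaded RpcController, or null when it is a shaded or unrelated type. */
>     public static RpcController asNonShadedController(Object controller) {
>         return (controller instanceof RpcController) ? (RpcController) controller : null;
>     }
> }
> {noformat}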



--
This message was sent by Atlassian Jira
(v8.3.4#803005)