Posted to dev@phoenix.apache.org by "Chinmay Kulkarni (Jira)" <ji...@apache.org> on 2019/12/06 21:58:00 UTC

[jira] [Updated] (PHOENIX-5605) 4.14 Client can't add a column to a table on a 4.15 server

     [ https://issues.apache.org/jira/browse/PHOENIX-5605?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chinmay Kulkarni updated PHOENIX-5605:
--------------------------------------
    Fix Version/s: 4.15.0

> 4.14 Client can't add a column to a table on a 4.15 server
> ----------------------------------------------------------
>
>                 Key: PHOENIX-5605
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-5605
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.15.0
>            Reporter: Geoffrey Jacoby
>            Assignee: Chinmay Kulkarni
>            Priority: Blocker
>             Fix For: 4.15.0
>
>
> I took a fresh HBase cluster running the 4.15 Phoenix server and connected to it for the first time with a 4.14 client, which created the 4.14-era system tables. I then ran an internal tool to apply the DDL for my use case.
> The tool was able to create a non-system table successfully, but an ALTER statement adding a column to that table failed with a TableNotFoundException; a minimal repro sketch follows the stack trace below.
> {code:java}
> org.apache.hadoop.hbase.TableNotFoundException: org.apache.hadoop.hbase.TableNotFoundException: Table 'SYSTEM.CHILD_LINK' was not found, got: SYSTEM.CATALOG.
> 	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1357)
> 	at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1223)
> 	at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
> 	at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:357)
> 	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:162)
> 	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:58)
> 	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:231)
> 	at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:273)
> 	at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:434)
> 	at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:308)
> 	at org.apache.phoenix.util.ViewUtil.findRelatedViews(ViewUtil.java:125)
> 	at org.apache.phoenix.util.ViewUtil.findAllRelatives(ViewUtil.java:96)
> 	at org.apache.phoenix.util.ViewUtil.findAllRelatives(ViewUtil.java:90)
> 	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.findAllChildViews(MetaDataEndpointImpl.java:2119)
> 	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.mutateColumn(MetaDataEndpointImpl.java:2508)
> 	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addColumn(MetaDataEndpointImpl.java:2799)
> 	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:17248)
> 	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:8528)
> 	at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:2282)
> 	at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:2264)
> 	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:36808)
> 	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2399)
> 	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:124)
> 	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:311)
> 	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:291)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
> 	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
> 	at org.apache.hadoop.hbase.protobuf.ProtobufUtil.makeIOExceptionOfException(ProtobufUtil.java:368)
> 	at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:327)
> 	at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1737)
> 	at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:100)
> 	at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:90)
> 	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:137)
> 	at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:104)
> 	at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:56)
> 	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.addColumn(MetaDataProtos.java:16679)
> 	at org.apache.phoenix.query.ConnectionQueryServicesImpl$10.call(ConnectionQueryServicesImpl.java:1961)
> 	at org.apache.phoenix.query.ConnectionQueryServicesImpl$10.call(ConnectionQueryServicesImpl.java:1949)
> 	at org.apache.hadoop.hbase.client.HTable$15.call(HTable.java:1768)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:748){code}
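> For context, here is a minimal sketch of the kind of client-side sequence that hits this path. It assumes a plain Phoenix JDBC connection from the 4.14 client; the connection URL, schema, table, and column names are hypothetical and not taken from the original report.
> {code:java}
> // Hypothetical repro sketch: a 4.14 Phoenix JDBC client talking to a 4.15 server.
> import java.sql.Connection;
> import java.sql.DriverManager;
> import java.sql.Statement;
> 
> public class AlterColumnRepro {
>     public static void main(String[] args) throws Exception {
>         // The first 4.14 connection creates only the 4.14-era SYSTEM tables,
>         // so SYSTEM.CHILD_LINK (added in 4.15) is never created.
>         try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181");
>              Statement stmt = conn.createStatement()) {
>             // Creating a non-system table succeeds.
>             stmt.execute("CREATE TABLE IF NOT EXISTS MY_SCHEMA.MY_TABLE ("
>                     + "ID BIGINT NOT NULL PRIMARY KEY, COL1 VARCHAR)");
>             // The ALTER fails: the 4.15 server-side MetaDataEndpointImpl scans
>             // SYSTEM.CHILD_LINK for child views, and that table does not exist.
>             stmt.execute("ALTER TABLE MY_SCHEMA.MY_TABLE ADD COL2 VARCHAR");
>         }
>     }
> }
> {code}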
>  
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)