Posted to user@phoenix.apache.org by Tanvi Bhandari <ta...@gmail.com> on 2018/09/11 16:32:45 UTC

Issue in upgrading phoenix : java.lang.ArrayIndexOutOfBoundsException: SYSTEM:CATALOG 63

Hi,



I am trying to upgrade the Phoenix binaries in my setup from phoenix-4.6 (where schemas were an optional concept) to phoenix-4.14 (where a schema is mandatory).

Earlier, I had the phoenix-4.6-hbase-1.1 binaries. When I run the phoenix-4.14-hbase-1.3 binaries on the same data, HBase comes up fine, but when I try to connect to Phoenix using the sqlline client, I get the following error on the *console*:



18/09/07 04:22:48 WARN ipc.CoprocessorRpcChannel: Call failed on IOException
org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: SYSTEM:CATALOG: 63
        at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:120)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getVersion(MetaDataEndpointImpl.java:3572)
        at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16422)
        at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
        at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1875)
        at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
        at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
        at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ArrayIndexOutOfBoundsException: 63
        at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:517)
        at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:421)
        at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:406)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:1046)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:587)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1305)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getVersion(MetaDataEndpointImpl.java:3568)
        ... 10 more

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
        at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:326)
        at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1629)
        at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:104)
        at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:94)
        at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
        at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:107)
        at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:56)
        at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.getVersion(MetaDataProtos.java:16739)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$5.call(ConnectionQueryServicesImpl.java:1271)
        at org.apache.phoenix.query.ConnectionQueryServicesImpl$5.call(ConnectionQueryServicesImpl.java:1263)
        at org.apache.hadoop.hbase.client.HTable$15.call(HTable.java:1736)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)





*Region-server logs are as follows: *

2018-09-07 03:23:36,170 ERROR [B.defaultRpcServer.handler=1,queue=1,port=29062] coprocessor.MetaDataEndpointImpl: loading system catalog table inside getVersion failed
java.lang.ArrayIndexOutOfBoundsException: 63
        at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:517)
        at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:421)
        at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:406)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:1046)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:587)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.loadTable(MetaDataEndpointImpl.java:1305)
        at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getVersion(MetaDataEndpointImpl.java:3568)
        at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16422)
        at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
        at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1875)
        at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
        at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
        at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
        at java.lang.Thread.run(Thread.java:745)



Suspecting that something could be wrong with the SYSTEM tables, I went ahead and dropped all SYSTEM tables from the hbase shell and tried connecting with the Phoenix sqlline client again. This time connecting through phoenix-sqlline worked for me, but none of my tables were visible in the Phoenix shell; only the SYSTEM tables were. So I went ahead and mapped my hbase tables to Phoenix by creating them explicitly from the sqlline client: I first created a schema corresponding to the namespace and then the tables (a rough sketch of these DDLs follows below). This way my tables became visible in sqlline. A *select count(*)* query on my table returns the 8 expected records, but a *select ** query returns no records. Can someone tell me what I can do next in this case?
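
For reference, the mapping DDLs looked roughly like this (schema, table, and column names below are placeholders, not my real ones; this assumes namespace mapping is enabled via phoenix.schema.isNamespaceMappingEnabled=true on client and server):

-- create the schema matching the existing HBase namespace
CREATE SCHEMA IF NOT EXISTS "my_schema";

-- re-create the table definition over the existing HBase data
CREATE TABLE "my_schema"."my_table" (
    "pk" VARCHAR PRIMARY KEY,
    "cf"."col1" VARCHAR,
    "cf"."col2" VARCHAR
);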



Thanks,

Tanvi

Re: Issue in upgrading phoenix : java.lang.ArrayIndexOutOfBoundsException: SYSTEM:CATALOG 63

Posted by Tanvi Bhandari <ta...@gmail.com>.
Hi Jaanai Zhang,

When you say migrate the data, do you mean somehow exporting the data from the Phoenix tables (phoenix-4.6) and bulk-inserting it into new Phoenix tables (phoenix-4.14)?
Do you have any data migration script or similar that I could use?

Thanks,
Tanvi


Re: Issue in upgrading phoenix : java.lang.ArrayIndexOutOfBoundsException: SYSTEM:CATALOG 63

Posted by Jaanai Zhang <cl...@gmail.com>.
It seems it is not possible to upgrade directly from Phoenix-4.6 to Phoenix-4.14: the schema of the SYSTEM tables has changed, and some features will be incompatible. Maybe you can instead migrate the data from Phoenix-4.6 to Phoenix-4.14; that way you can make sure everything is right.
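
For example, a rough sketch of one such migration (paths, the ZooKeeper quorum, and table names below are placeholders only):

# on the phoenix-4.6 cluster: export a table to CSV from sqlline
#   !outputformat csv
#   !record /tmp/my_table.csv
#   SELECT * FROM "my_table";
#   !record
# on the phoenix-4.14 cluster: re-create the table with its DDL,
# then bulk-load the CSV with the bundled psql.py loader
bin/psql.py -t MY_TABLE zkhost:2181 /tmp/my_table.csv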

----------------------------------------
   Jaanai Zhang
   Best regards!




Re: Issue in upgrading phoenix : java.lang.ArrayIndexOutOfBoundsException: SYSTEM:CATALOG 63

Posted by Tanvi Bhandari <ta...@gmail.com>.
@Shamvenk

Yes, I did check the STATS table from the hbase shell; it's not empty.

After dropping all SYSTEM tables and mapping the hbase tables to Phoenix tables by executing all the DDLs, I am seeing a new issue.

I have a table and an index on that table. The numbers of records in the index table and the main table no longer match:

select count(*) from "my_index";
select count(COL) from "my_table"; -- where COL is not part of the index

Can someone tell me what can be done here? Is there an easier way to upgrade from Phoenix-4.6 to Phoenix-4.14?




Re: Issue in upgrading phoenix : java.lang.ArrayIndexOutOfBoundsException: SYSTEM:CATALOG 63

Posted by venk sham <sh...@gmail.com>.
Did you check SYSTEM.STATS? If it is empty, it needs to be rebuilt by running a major compaction on HBase.
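
For example, from the hbase shell (use SYSTEM:STATS if namespace mapping is enabled, SYSTEM.STATS otherwise):

hbase shell
> major_compact 'SYSTEM:STATS'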


Re: Issue in upgrading phoenix : java.lang.ArrayIndexOutOfBoundsException: SYSTEM:CATALOG 63

Posted by Tanvi Bhandari <ta...@gmail.com>.
I think I found the issue:

I had tables created in Phoenix 4.6, which did not have the column name encoding feature (https://issues.apache.org/jira/browse/PHOENIX-1598), and I now have to move to Phoenix 4.14 directly. As far as I know, Phoenix handles the upgrade for column name encoding using the SYSTEM tables. Since connecting to Phoenix 4.14 was failing with the ArrayIndexOutOfBoundsException on SYSTEM:CATALOG, I went ahead and deleted all SYSTEM tables from the hbase shell. Reconnecting from sqlline then re-created all the Phoenix SYSTEM tables, and I used my tables' DDLs to recreate the Phoenix tables on top of the existing hbase tables. HBase was still showing all the column names of my table as-is (in plain English).

The *select count(*)* query was returning the correct number of records because it is presumably computed from the row keys, but the *select ** query was not able to map the column names, since they are not encoded in the HBase table.

*Solution*:
I disabled column name encoding by setting *phoenix.default.column.encoded.bytes.attrib=0* in my hbase-site.xml (globally). Alternatively, it can be set per table by adding *COLUMN_ENCODED_BYTES = 0* to the CREATE TABLE statement.
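
For reference, the hbase-site.xml entry for the global setting looks like this (the file location depends on your deployment):

<property>
  <name>phoenix.default.column.encoded.bytes.attrib</name>
  <value>0</value>
</property>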

*Steps for upgrade from phoenix-4.6 to phoenix-4.14:*
1) After upgrading to phoenix-4.14, set "phoenix.default.column.encoded.bytes.attrib=0" in hbase-site.xml (if you want to disable column name encoding globally), or set it at the table level in the DDL.
2) Delete all SYSTEM tables from the hbase shell.
3) Connect again through phoenix-4.14 sqlline, which will recreate all SYSTEM tables.
4) Map all your data tables from hbase to Phoenix by executing all the DDLs (add "COLUMN_ENCODED_BYTES = 0" to the create statement if you do not want to disable column name encoding globally; see the sketch below).
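
For example, a per-table DDL would look roughly like this (schema, table, and column names are placeholders):

CREATE TABLE "my_schema"."my_table" (
    "pk" VARCHAR PRIMARY KEY,
    "cf"."col1" VARCHAR
) COLUMN_ENCODED_BYTES = 0;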

P.S. If you have immutable tables, you might want to handle those as well while disabling column name encoding (https://phoenix.apache.org/columnencoding.html). After this I got all my data back through Phoenix as well.

Thanks,
Tanvi

On Thu, Sep 13, 2018 at 2:06 AM Thomas D'Silva <td...@salesforce.com>
wrote:

> can you attach the schema of your table? and the explain plan for select *
> from mytable?
>
> On Tue, Sep 11, 2018 at 10:24 PM, Tanvi Bhandari <tanvi.bhandari@gmail.com
> > wrote:
>
>> " mapped hbase tables to phoenix and created them explicitly from
>> phoenix sqlline client. I first created schema corresponding to namespace
>> and then tables." By this statement, I meant the same. I re-created my
>> tables since I had the DDLs with me.
>>
>> After that I tried getting the count of records in my table which gave me
>> 8 records (expected result). - *select count(*) from "myTable"*;
>> But when I performed the *select * from "myTable";* it is not returning
>> any result.
>>
>> On Wed, Sep 12, 2018 at 1:55 AM Thomas D'Silva <td...@salesforce.com>
>> wrote:
>>
>>> Since you dropped all the system tables, all the phoenix metadata was
>>> lost. If you have the ddl statements used to create your tables, you can
>>> try rerunning them.
>>>

Re: Issue in upgrading phoenix : java.lang.ArrayIndexOutOfBoundsException: SYSTEM:CATALOG 63

Posted by Thomas D'Silva <td...@salesforce.com>.
Can you attach the schema of your table, and the explain plan for select * from mytable?
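
For example, from sqlline (with the table name as it appears in your schema):

EXPLAIN SELECT * FROM "mytable";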

On Tue, Sep 11, 2018 at 10:24 PM, Tanvi Bhandari <ta...@gmail.com>
wrote:

> " mapped hbase tables to phoenix and created them explicitly from phoenix
> sqlline client. I first created schema corresponding to namespace and then
> tables." By this statement, I meant the same. I re-created my tables
> since I had the DDLs with me.
>
> After that I tried getting the count of records in my table which gave me
> 8 records (expected result). - *select count(*) from "myTable"*;
> But when I performed the *select * from "myTable";* it is not returning
> any result.
>
> On Wed, Sep 12, 2018 at 1:55 AM Thomas D'Silva <td...@salesforce.com>
> wrote:
>
>> Since you dropped all the system tables, all the phoenix metadata was
>> lost. If you have the ddl statements used to create your tables, you can
>> try rerunning them.
>>
>> On Tue, Sep 11, 2018 at 9:32 AM, Tanvi Bhandari <tanvi.bhandari@gmail.com
>> > wrote:
>>
>>> [original message, console error, and region-server stack traces quoted
>>> in full; snipped here, see the first message in this thread]
>>>
>>
>>

Re: Issue in upgrading phoenix : java.lang.ArrayIndexOutOfBoundsException: SYSTEM:CATALOG 63

Posted by Tanvi Bhandari <ta...@gmail.com>.
" mapped hbase tables to phoenix and created them explicitly from phoenix
sqlline client. I first created schema corresponding to namespace and then
tables." By this statement, I meant the same. I re-created my tables since
I had the DDLs with me.

After that I tried getting the count of records in my table which gave me 8
records (expected result). - *select count(*) from "myTable"*;
But when I performed the *select * from "myTable";* it is not returning any
result.
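
Concretely, the two statements behave like this (same table as above):

    SELECT COUNT(*) FROM "myTable";   -- returns a count of 8, as expected
    SELECT * FROM "myTable";          -- returns no rows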

On Wed, Sep 12, 2018 at 1:55 AM Thomas D'Silva <td...@salesforce.com>
wrote:

> Since you dropped all the SYSTEM tables, all the Phoenix metadata was
> lost. If you have the DDL statements used to create your tables, you can
> try rerunning them.
>
> On Tue, Sep 11, 2018 at 9:32 AM, Tanvi Bhandari <ta...@gmail.com>
> wrote:
>
>> [original message, console error, and region-server stack traces quoted
>> in full; snipped here, see the first message in this thread]
>>
>
>

Re: Issue in upgrading phoenix : java.lang.ArrayIndexOutOfBoundsException: SYSTEM:CATALOG 63

Posted by Thomas D'Silva <td...@salesforce.com>.
Since you dropped all the SYSTEM tables, all the Phoenix metadata was lost.
If you have the DDL statements used to create your tables, you can try
rerunning them.
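
As a rough sketch, assuming a namespace-mapped table (the schema, table,
and column names below are placeholders, not your actual DDL):

    CREATE SCHEMA IF NOT EXISTS "MY_NAMESPACE";
    CREATE TABLE "MY_NAMESPACE"."MY_TABLE" (
        "id" VARCHAR PRIMARY KEY,   -- row key column
        "cf"."col1" VARCHAR         -- column family "cf", qualifier "col1"
    );

Since the underlying HBase tables still exist, a CREATE TABLE with a
matching name should map onto the existing data rather than start empty.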

On Tue, Sep 11, 2018 at 9:32 AM, Tanvi Bhandari <ta...@gmail.com>
wrote:

> [upgrade description, console error, and region-server stack traces from
> the original message snipped; see the first message in this thread]
>
>
>
> Suspecting that something was wrong with the SYSTEM tables, I went ahead,
> dropped all SYSTEM tables from the hbase shell, and tried connecting with
> the Phoenix sqlline client again. This time the connection worked, but
> none of my tables were visible in the Phoenix shell; only the SYSTEM
> tables were. So I mapped my hbase tables to Phoenix by creating them
> explicitly from the sqlline client: I first created a schema corresponding
> to each namespace, and then the tables. This way my tables became visible
> in sqlline. A *select count(*)* query on my table returns the expected 8
> records, but a *select ** query returns no rows. Can someone tell me what
> I can do next in this case?
>
>
>
> Thanks,
>
> Tanvi
>
>
>