Posted to user@phoenix.apache.org by "Saurabh Agarwal (BLOOMBERG/ 731 LEX)" <sa...@bloomberg.net> on 2016/03/12 01:25:39 UTC

Re: Phoenix table is inaccessible...

Thanks. I will try that. A couple of questions: I am able to access other tables fine; if SYSTEM.CATALOG got corrupted, wouldn't it impact all tables? 

Also, how do I restore the SYSTEM.CATALOG table without restarting sqlline? 


Sent from Bloomberg Professional for iPhone 

----- Original Message -----
From: Sergey Soldatov <se...@gmail.com>
To: SAURABH AGARWAL, user@phoenix.apache.org
CC: ANIRUDHA JADHAV
At: 11-Mar-2016 19:07:31


Hi Saurabh,
It seems that your SYSTEM.CATALOG got corrupted somehow. Usually you
need to disable and drop 'SYSTEM.CATALOG' in the hbase shell. After that,
restart sqlline (it will automatically recreate the system catalog) and
recreate all user tables. The table data is usually not affected, but
make a backup of your HBase beforehand, just in case.
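Concretely, the hbase shell steps would be (after taking the backup):

  hbase(main):001:0> disable 'SYSTEM.CATALOG'
  hbase(main):002:0> drop 'SYSTEM.CATALOG'

On the next connection sqlline will notice the catalog is missing and
recreate an empty one.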

Possibly someone has better advice.

Thanks,
Sergey

On Fri, Mar 11, 2016 at 3:05 PM, Saurabh Agarwal (BLOOMBERG/ 731 LEX)
<sa...@bloomberg.net> wrote:
> Hi,
>
> I had been experimenting with different indexes on a Phoenix table to get
> the desired performance.
>
> After creating a secondary index that indexes one column and includes the
> rest of the fields, the table started throwing the following exceptions
> whenever I access it.
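>
> The index DDL was of this shape (the index and column names below are
> placeholders, not the actual schema):
>
> CREATE INDEX "WeatherIdx" ON "Weather" ("col1") INCLUDE ("col2", "col3");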
>
> Can you point out what might have gone wrong here?
>
> We are using HDP 2.3 - HBase 1.1.2.2.3.2.0-2950, phoenix-4.4.0.2.3.2.0-2950
>
> 0: jdbc:phoenix:> select count(*) from "Weather";
> 16/03/11 17:37:32 WARN ipc.CoprocessorRpcChannel: Call failed on IOException
> org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: com.bloomberg.ds.WeatherSmallSalt: 35
> at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:447)
> at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10505)
> at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1875)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
> at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
> at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 35
> at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
> at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
> at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:826)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addIndexToTable(MetaDataEndpointImpl.java:526)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:803)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:430)
> ... 10 more
>
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
> at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
> at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
> at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:325)
> at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1622)
> at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:92)
> at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:89)
> at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
> at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:95)
> at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:56)
> at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.getTable(MetaDataProtos.java:10665)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1292)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1279)
> at org.apache.hadoop.hbase.client.HTable$16.call(HTable.java:1751)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: com.bloomberg.ds.WeatherSmallSalt: 35
> at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:447)
> at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10505)
> at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1875)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
> at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
> at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 35
> at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
> at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
> at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:826)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addIndexToTable(MetaDataEndpointImpl.java:526)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:803)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:430)
> ... 10 more
>
> at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1226)
> at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
> at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
> at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:32675)
> at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1618)
> ... 13 more
> 16/03/11 17:37:32 WARN client.HTable: Error calling coprocessor service org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService for row \x00\x00com.bloomberg.ds.WeatherSmallSalt
> java.util.concurrent.ExecutionException: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: com.bloomberg.ds.WeatherSmallSalt: 35
> at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:447)
> at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10505)
> at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1875)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
> at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
> at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 35
> at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
> at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
> at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:826)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addIndexToTable(MetaDataEndpointImpl.java:526)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:803)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:430)
> ... 10 more
>
> at java.util.concurrent.FutureTask.report(FutureTask.java:122)
> at java.util.concurrent.FutureTask.get(FutureTask.java:192)
> at org.apache.hadoop.hbase.client.HTable.coprocessorService(HTable.java:1763)
> at org.apache.hadoop.hbase.client.HTable.coprocessorService(HTable.java:1719)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1026)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1006)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.getTable(ConnectionQueryServicesImpl.java:1278)
> at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:415)
> at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:358)
> at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:354)
> at org.apache.phoenix.compile.FromCompiler$BaseColumnResolver.createTableRef(FromCompiler.java:413)
> at org.apache.phoenix.compile.FromCompiler$SingleTableColumnResolver.<init>(FromCompiler.java:288)
> at org.apache.phoenix.compile.FromCompiler.getResolverForQuery(FromCompiler.java:189)
> at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:358)
> at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:339)
> at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:247)
> at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:242)
> at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
> at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:241)
> at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1257)
> at sqlline.Commands.execute(Commands.java:822)
> at sqlline.Commands.sql(Commands.java:732)
> at sqlline.SqlLine.dispatch(SqlLine.java:808)
> at sqlline.SqlLine.begin(SqlLine.java:681)
> at sqlline.SqlLine.start(SqlLine.java:398)
> at sqlline.SqlLine.main(SqlLine.java:292)
> Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: com.bloomberg.ds.WeatherSmallSalt: 35
> at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:447)
> at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10505)
> at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1875)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
> at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
> at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 35
> at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
> at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
> at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:826)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addIndexToTable(MetaDataEndpointImpl.java:526)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:803)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:430)
> ... 10 more
>
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
> at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
> at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
> at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:325)
> at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1622)
> at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:92)
> at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:89)
> at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
> at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:95)
> at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:56)
> at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.getTable(MetaDataProtos.java:10665)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1292)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1279)
> at org.apache.hadoop.hbase.client.HTable$16.call(HTable.java:1751)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: com.bloomberg.ds.WeatherSmallSalt: 35
> at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:447)
> at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10505)
> at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1875)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
> at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
> at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 35
> at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
> at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
> at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:826)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addIndexToTable(MetaDataEndpointImpl.java:526)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:803)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:430)
> ... 10 more
>
> at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1226)
> at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
> at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
> at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:32675)
> at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1618)
> ... 13 more
> Error: org.apache.hadoop.hbase.DoNotRetryIOException: com.bloomberg.ds.WeatherSmallSalt: 35
> at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:447)
> at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10505)
> at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1875)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
> at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
> at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 35
> at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
> at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
> at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:826)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addIndexToTable(MetaDataEndpointImpl.java:526)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:803)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:430)
> ... 10 more (state=08000,code=101)
> org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: com.bloomberg.ds.WeatherSmallSalt: 35
> at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:447)
> at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10505)
> at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1875)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
> at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
> at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 35
> at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
> at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
> at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:826)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addIndexToTable(MetaDataEndpointImpl.java:526)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:803)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:430)
> ... 10 more
>
> at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1043)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryServicesImpl.java:1006)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.getTable(ConnectionQueryServicesImpl.java:1278)
> at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:415)
> at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:358)
> at org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:354)
> at org.apache.phoenix.compile.FromCompiler$BaseColumnResolver.createTableRef(FromCompiler.java:413)
> at org.apache.phoenix.compile.FromCompiler$SingleTableColumnResolver.<init>(FromCompiler.java:288)
> at org.apache.phoenix.compile.FromCompiler.getResolverForQuery(FromCompiler.java:189)
> at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:358)
> at org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:339)
> at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:247)
> at org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:242)
> at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
> at org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:241)
> at org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1257)
> at sqlline.Commands.execute(Commands.java:822)
> at sqlline.Commands.sql(Commands.java:732)
> at sqlline.SqlLine.dispatch(SqlLine.java:808)
> at sqlline.SqlLine.begin(SqlLine.java:681)
> at sqlline.SqlLine.start(SqlLine.java:398)
> at sqlline.SqlLine.main(SqlLine.java:292)
> Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: com.bloomberg.ds.WeatherSmallSalt: 35
> at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:447)
> at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10505)
> at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1875)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
> at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
> at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 35
> at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
> at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
> at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:826)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addIndexToTable(MetaDataEndpointImpl.java:526)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:803)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:430)
> ... 10 more
>
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
> at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
> at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
> at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:325)
> at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1622)
> at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:92)
> at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:89)
> at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
> at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:95)
> at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:56)
> at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.getTable(MetaDataProtos.java:10665)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1292)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1279)
> at org.apache.hadoop.hbase.client.HTable$16.call(HTable.java:1751)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: com.bloomberg.ds.WeatherSmallSalt: 35
> at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:447)
> at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:10505)
> at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1875)
> at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
> at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32209)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
> at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 35
> at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
> at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
> at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:826)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addIndexToTable(MetaDataEndpointImpl.java:526)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:803)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643)
> at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:430)
> ... 10 more
>
> at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1226)
> at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
> at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
> at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:32675)
> at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1618)
> ... 13 more
>

Re: Phoenix table is inaccessible...

Posted by Jonathan Leech <jo...@gmail.com>.
I've seen these kinds of errors when the regions for the secondary index end up on a different region server than the main table. Make sure the configuration is all correct and also look for regions stuck in transition, etc. Try bouncing HBase, then drop all secondary indexes on the table, as well as its HBase table if necessary. 
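
As a sketch (the index name here is hypothetical), that cleanup is a DROP
INDEX from sqlline, then disabling and dropping the index's backing HBase
table in the hbase shell if it gets left behind:

  0: jdbc:phoenix:> DROP INDEX "WeatherIdx" ON "Weather";

  hbase(main):001:0> disable 'WEATHERIDX'
  hbase(main):002:0> drop 'WEATHERIDX'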


> On Mar 11, 2016, at 6:29 PM, Sergey Soldatov <se...@gmail.com> wrote:
> 
> The system information about all Phoenix tables is located in the HBase
> SYSTEM.CATALOG table. So, if you recreate the catalog, you will need to
> recreate all tables as well. I'm not sure whether there is any other way
> to fix it.
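>
> Since the underlying HBase data stays in place, re-running the original
> DDL should map it back in. For example (the schema here is hypothetical):
>
> 0: jdbc:phoenix:> CREATE TABLE "Weather" ("id" VARCHAR PRIMARY KEY, "temp" DOUBLE);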
> 
> On Fri, Mar 11, 2016 at 4:25 PM, Saurabh Agarwal (BLOOMBERG/ 731 LEX)
> <sa...@bloomberg.net> wrote:
>> Thanks. I will try that. A couple of questions: I am able to access other
>> tables fine; if SYSTEM.CATALOG got corrupted, wouldn't it impact all tables?
>> 
>> Also, how do I restore the SYSTEM.CATALOG table without restarting sqlline?
>> 
>> 
>> Sent from Bloomberg Professional for iPhone
>> 
>> 
>> ----- Original Message -----
>> From: Sergey Soldatov <se...@gmail.com>
>> To: SAURABH AGARWAL, user@phoenix.apache.org
>> CC: ANIRUDHA JADHAV
>> At: 11-Mar-2016 19:07:31
>> 
>> Hi Saurabh,
>> It seems that your SYSTEM.CATALOG got corrupted somehow. Usually you
>> need to disable and drop 'SYSTEM.CATALOG' in the hbase shell. After that,
>> restart sqlline (it will automatically recreate the system catalog) and
>> recreate all user tables. The table data is usually not affected, but
>> make a backup of your HBase beforehand, just in case.
>> 
>> Possibly someone has better advice.
>> 
>> Thanks,
>> Sergey
>> 
>>> otos.java:10505)
>>> at
>>> 
>>> org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
>>> at
>>> 
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:187
>>> 5)
>>> at
>>> 
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
>>> at
>>> 
>>> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(Cl
>>> ientProtos.java:32209)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>> at java.lang.Thread.run(Thread.java:745)
>>> Caused by: java.lang.ArrayIndexOutOfBoundsException: 35
>>> at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
>>> at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
>>> at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:826)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addIndexToTable(MetaDataEndpointImpl.java
>>> :526)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:803)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:430)
>>> ... 10 more (state=08000,code=101)
>>> org.apache.phoenix.exception.PhoenixIOException:
>>> org.apache.hadoop.hbase.DoNotRetryIOException: com.bloo
>>> mberg.ds.WeatherSmallSalt: 35
>>> at
>>> org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:447)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataPr
>>> otos.java:10505)
>>> at
>>> 
>>> org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
>>> at
>>> 
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:187
>>> 5)
>>> at
>>> 
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
>>> at
>>> 
>>> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(Cl
>>> ientProtos.java:32209)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>> at java.lang.Thread.run(Thread.java:745)
>>> Caused by: java.lang.ArrayIndexOutOfBoundsException: 35
>>> at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
>>> at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
>>> at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:826)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addIndexToTable(MetaDataEndpointImpl.java
>>> :526)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:803)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:430)
>>> ... 10 more
>>> 
>>> at
>>> 
>>> org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
>>> at
>>> 
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryS
>>> ervicesImpl.java:1043)
>>> at
>>> 
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.metaDataCoprocessorExec(ConnectionQueryS
>>> ervicesImpl.java:1006)
>>> at
>>> 
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.getTable(ConnectionQueryServicesImpl.jav
>>> a:1278)
>>> at
>>> 
>>> org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:415)
>>> at
>>> 
>>> org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:358)
>>> at
>>> 
>>> org.apache.phoenix.schema.MetaDataClient.updateCache(MetaDataClient.java:354)
>>> at
>>> 
>>> org.apache.phoenix.compile.FromCompiler$BaseColumnResolver.createTableRef(FromCompiler.java:4
>>> 13)
>>> at
>>> 
>>> org.apache.phoenix.compile.FromCompiler$SingleTableColumnResolver.<init>(FromCompiler.java:28
>>> 8)
>>> at
>>> 
>>> org.apache.phoenix.compile.FromCompiler.getResolverForQuery(FromCompiler.java:189)
>>> at
>>> 
>>> org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStateme
>>> nt.java:358)
>>> at
>>> 
>>> org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStateme
>>> nt.java:339)
>>> at
>>> org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:247)
>>> at
>>> org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:242)
>>> at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>>> at
>>> 
>>> org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:241)
>>> at
>>> 
>>> org.apache.phoenix.jdbc.PhoenixStatement.execute(PhoenixStatement.java:1257)
>>> at sqlline.Commands.execute(Commands.java:822)
>>> at sqlline.Commands.sql(Commands.java:732)
>>> at sqlline.SqlLine.dispatch(SqlLine.java:808)
>>> at sqlline.SqlLine.begin(SqlLine.java:681)
>>> at sqlline.SqlLine.start(SqlLine.java:398)
>>> at sqlline.SqlLine.main(SqlLine.java:292)
>>> Caused by: org.apache.hadoop.hbase.DoNotRetryIOException:
>>> org.apache.hadoop.hbase.DoNotRetryIOException:
>>> com.bloomberg.ds.WeatherSmallSalt: 35
>>> at
>>> org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:447)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataPr
>>> otos.java:10505)
>>> at
>>> 
>>> org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
>>> at
>>> 
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:187
>>> 5)
>>> at
>>> 
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
>>> at
>>> 
>>> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(Cl
>>> ientProtos.java:32209)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>> at java.lang.Thread.run(Thread.java:745)
>>> Caused by: java.lang.ArrayIndexOutOfBoundsException: 35
>>> at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
>>> at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
>>> at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:826)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addIndexToTable(MetaDataEndpointImpl.java
>>> :526)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:803)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:430)
>>> ... 10 more
>>> 
>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>> at
>>> 
>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>> at
>>> 
>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.j
>>> ava:45)
>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>>> at
>>> 
>>> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>>> at
>>> 
>>> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>>> at
>>> 
>>> org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:325)
>>> at
>>> 
>>> org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1622)
>>> at
>>> 
>>> org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.ja
>>> va:92)
>>> at
>>> 
>>> org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.ja
>>> va:89)
>>> at
>>> 
>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>>> at
>>> 
>>> org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcC
>>> hannel.java:95)
>>> at
>>> 
>>> org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:56)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.getTable(MetaDat
>>> aProtos.java:10665)
>>> at
>>> 
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:
>>> 1292)
>>> at
>>> 
>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:
>>> 1279)
>>> at org.apache.hadoop.hbase.client.HTable$16.call(HTable.java:1751)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>> at
>>> 
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>> at
>>> 
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>> at java.lang.Thread.run(Thread.java:745)
>>> Caused by:
>>> 
>>> org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOExc
>>> eption): org.apache.hadoop.hbase.DoNotRetryIOException:
>>> com.bloomberg.ds.WeatherSmallSalt: 35
>>> at
>>> org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:84)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:447)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataPr
>>> otos.java:10505)
>>> at
>>> 
>>> org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7435)
>>> at
>>> 
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:187
>>> 5)
>>> at
>>> 
>>> org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1857)
>>> at
>>> 
>>> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(Cl
>>> ientProtos.java:32209)
>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>>> at
>>> org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>> at java.lang.Thread.run(Thread.java:745)
>>> Caused by: java.lang.ArrayIndexOutOfBoundsException: 35
>>> at org.apache.phoenix.schema.PTableImpl.init(PTableImpl.java:354)
>>> at org.apache.phoenix.schema.PTableImpl.<init>(PTableImpl.java:276)
>>> at org.apache.phoenix.schema.PTableImpl.makePTable(PTableImpl.java:265)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:826)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.addIndexToTable(MetaDataEndpointImpl.java
>>> :526)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:803)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.buildTable(MetaDataEndpointImpl.java:462)
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1696
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.doGetTable(MetaDataEndpointImpl.java:1643
>>> )
>>> at
>>> 
>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl.getTable(MetaDataEndpointImpl.java:430)
>>> ... 10 more
>>> 
>>> at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1226)
>>> at
>>> 
>>> org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
>>> at
>>> 
>>> org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMe
>>> thod(AbstractRpcClient.java:287)
>>> at
>>> 
>>> org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execServic
>>> e(ClientProtos.java:32675)
>>> at
>>> 
>>> org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1618)
>>> ... 13 more
> 


Re: Phoenix table is inaccessible...

Posted by Sergey Soldatov <se...@gmail.com>.
The system information about all Phoenix tables is located in the HBase
SYSTEM.CATALOG table. So, if you recreate the catalog, you will need to
recreate all user tables as well. I'm not sure whether there is any other
way to fix it.
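
For what it's worth, the sequence Sergey described earlier would look
something like this (a sketch only -- the snapshot name is a placeholder,
and each CREATE TABLE must use your table's original DDL):

  # hbase shell: back up the catalog first, then remove it
  snapshot 'SYSTEM.CATALOG', 'syscat_backup'
  disable 'SYSTEM.CATALOG'
  drop 'SYSTEM.CATALOG'

Reconnecting with sqlline then recreates an empty SYSTEM.CATALOG, and the
user tables are re-registered by re-running their DDL over the existing
HBase data, e.g.:

  CREATE TABLE "Weather" ( ... );

If that goes wrong, the old catalog can be brought back by disabling and
dropping the new one and then running
clone_snapshot 'syscat_backup', 'SYSTEM.CATALOG' in hbase shell.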

On Fri, Mar 11, 2016 at 4:25 PM, Saurabh Agarwal (BLOOMBERG/ 731 LEX)
<sa...@bloomberg.net> wrote:
> Thanks. I will try that. Questions? I am able to access other tables fine.
> If SYSTEM.CATALOG got corrupted, wouldn't it impact all tables?
>
> Also how to restore SYSTEM.CATALOG table without restarting sqlline?
>
>
> Sent from Bloomberg Professional for iPhone
>
>
> ----- Original Message -----
> From: Sergey Soldatov <se...@gmail.com>
> To: SAURABH AGARWAL, user@phoenix.apache.org
> CC: ANIRUDHA JADHAV
> At: 11-Mar-2016 19:07:31
>
> Hi Saurabh,
> It seems that your SYSTEM.CATALOG got corrupted somehow. Usually you
> need to disable and drop 'SYSTEM.CATALOG' in hbase shell. After that
> restart sqlline (it will automatically recreate system catalog) and
> recreate all user tables. The table data usually is not affected, but
> just in case make a backup of your hbase before.
>
> Possible someone has a better advice.
>
> Thanks,
> Sergey
>