Posted to user@phoenix.apache.org by Ninel Hodzic <ni...@gmail.com> on 2016/06/04 00:09:07 UTC

Re: Issues while running psql.py localhost command

@Ashutosh

Did you solve this? I think I have a similar issue with Phoenix
4.7.0/HBase 1.1.

I tried with phoenix-core-4.7.0...jar and it worked, but JOINs were not
supported (
http://stackoverflow.com/questions/30730643/phoenix-join-operation-not-working-with-hbase),
so I replaced it with phoenix-server-4.7.0...jar, but that throws a sanity
check exception.
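
For context, a minimal sketch of the kind of join that fails in that setup
(the table and column names below are purely hypothetical):

-- hypothetical schema; Phoenix's default hash joins rely on server-side
-- coprocessors, which ship in the phoenix-server jar on each region server
SELECT o.order_id, c.name
FROM orders o
JOIN customers c ON o.customer_id = c.customer_id;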

Any help on this?

On Mon, Sep 21, 2015 at 3:59 PM, Ashutosh Sharma <
ashu.sharma.india@gmail.com> wrote:

> Which phoenix*server*.jar needs to be copied into the HBase lib directory?
>
> I think I'm missing something on that front... on one Ubuntu machine the
> same setup with the same version is working, but not on the other... I
> think I haven't copied the correct JAR file into the HBase lib directory.
>
> On Mon, Sep 21, 2015 at 2:12 AM, rajeshbabu@apache.org <
> chrajeshbabu32@gmail.com> wrote:
>
>> @James
>>
>> You can check this https://issues.apache.org/jira/browse/HBASE-10591 for
>> more information.
>> Some sanity checks of table attributes are failing.
>>
>> @Ashutosh you can raise an issue to validate the table attributes that do
>> not meet the minimum criteria, or else you can specify them as table
>> properties while creating the table.
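>>
>> For illustration only, a minimal sketch (with a hypothetical table and
>> arbitrary property values) of specifying HBase attributes as table
>> properties at creation time:
>>
>> -- hypothetical example; the trailing options map to HBase table /
>> -- column family attributes, and the values here are arbitrary
>> CREATE TABLE IF NOT EXISTS stock_symbol (
>>     symbol VARCHAR NOT NULL PRIMARY KEY,
>>     company VARCHAR
>> ) VERSIONS=1, TTL=86400;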
>>
>>
>> On Mon, Sep 21, 2015 at 2:15 PM, James Heather <
>> james.heather@mendeley.com> wrote:
>>
>>> I don't know for certain what that parameter does but it sounds a bit
>>> scary to me...
>>>
>>>
>>> On 21/09/15 09:41, rajeshbabu@apache.org wrote:
>>>
>>> You can try adding the below property to hbase-site.xml and restarting HBase.
>>>
>>> <property>
>>>   <name>hbase.table.sanity.checks</name>
>>>   <value>false</value>
>>> </property>
>>>
>>> Thanks,
>>> Rajeshbabu.
>>>
>>> On Mon, Sep 21, 2015 at 12:51 PM, Ashutosh Sharma <
>>> ashu.sharma.india@gmail.com> wrote:
>>>
>>>> I am running into issues while running the Phoenix psql.py command
>>>> against my local HBase instance.
>>>>
>>>> The local HBase instance itself is running perfectly fine. Any help?
>>>>
>>>> root@ashu-HP-ENVY-15-Notebook-PC:/phoenix-4.5.2-HBase-1.1-bin/bin#
>>>> ./psql.py localhost /phoenix-4.5.2-HBase-1.1-src/examples/STOCK_SYMBOL.sql
>>>> 15/09/21 00:19:26 WARN util.NativeCodeLoader: Unable to load
>>>> native-hadoop library for your platform... using builtin-java classes where
>>>> applicable
>>>> org.apache.phoenix.exception.PhoenixIOException:
>>>> org.apache.hadoop.hbase.DoNotRetryIOException: Class
>>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl cannot be loaded Set
>>>> hbase.table.sanity.checks to false at conf or table descriptor if you want
>>>> to bypass sanity checks
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:1597)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1529)
>>>> at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1448)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:422)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:48502)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>>> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at
>>>> org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
>>>> at
>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:889)
>>>> at
>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1223)
>>>> at
>>>> org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:113)
>>>> at
>>>> org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1937)
>>>> at
>>>> org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:751)
>>>> at
>>>> org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:186)
>>>> at
>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:320)
>>>> at
>>>> org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:312)
>>>> at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>>>> at
>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:310)
>>>> at
>>>> org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1422)
>>>> at
>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1927)
>>>> at
>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:1896)
>>>> at
>>>> org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:77)
>>>> at
>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1896)
>>>> at
>>>> org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:180)
>>>> at
>>>> org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:132)
>>>> at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:151)
>>>> at java.sql.DriverManager.getConnection(DriverManager.java:664)
>>>> at java.sql.DriverManager.getConnection(DriverManager.java:208)
>>>> at org.apache.phoenix.util.PhoenixRuntime.main(PhoenixRuntime.java:192)
>>>> Caused by: org.apache.hadoop.hbase.DoNotRetryIOException:
>>>> org.apache.hadoop.hbase.DoNotRetryIOException: Class
>>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl cannot be loaded Set
>>>> hbase.table.sanity.checks to false at conf or table descriptor if you want
>>>> to bypass sanity checks
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:1597)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1529)
>>>> at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1448)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:422)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:48502)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>>> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>> at
>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>>>> at
>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
>>>> at
>>>> org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
>>>> at
>>>> org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
>>>> at
>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:226)
>>>> at
>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:240)
>>>> at
>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:140)
>>>> at
>>>> org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3917)
>>>> at
>>>> org.apache.hadoop.hbase.client.HBaseAdmin.createTableAsyncV2(HBaseAdmin.java:636)
>>>> at
>>>> org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:557)
>>>> at
>>>> org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:850)
>>>> ... 20 more
>>>> Caused by:
>>>> org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException):
>>>> org.apache.hadoop.hbase.DoNotRetryIOException: Class
>>>> org.apache.phoenix.coprocessor.MetaDataEndpointImpl cannot be loaded Set
>>>> hbase.table.sanity.checks to false at conf or table descriptor if you want
>>>> to bypass sanity checks
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:1597)
>>>> at
>>>> org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1529)
>>>> at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1448)
>>>> at
>>>> org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:422)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:48502)
>>>> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
>>>> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
>>>> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> at
>>>> org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1196)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
>>>> at
>>>> org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
>>>> at
>>>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.createTable(MasterProtos.java:51086)
>>>> at
>>>> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$4.createTable(ConnectionManager.java:1802)
>>>> at org.apache.hadoop.hbase.client.HBaseAdmin$4.call(HBaseAdmin.java:641)
>>>> at org.apache.hadoop.hbase.client.HBaseAdmin$4.call(HBaseAdmin.java:637)
>>>> at
>>>> org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
>>>> ... 24 more
>>>> root@ashu-HP-ENVY-15-Notebook-PC:/phoenix-4.5.2-HBase-1.1-bin/bin#
>>>>
>>>>
>>>> --
>>>> With best Regards:
>>>> Ashutosh Sharma
>>>>
>>>
>>>
>>>
>>
>
>
> --
> With best Regards:
> Ashutosh Sharma
>