Posted to dev@phoenix.apache.org by "Ievgen Nekrashevych (JIRA)" <ji...@apache.org> on 2018/11/22 08:35:00 UTC

[jira] [Updated] (PHOENIX-5041) can't create local index due to UpgradeRequiredException

     [ https://issues.apache.org/jira/browse/PHOENIX-5041?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ievgen Nekrashevych updated PHOENIX-5041:
-----------------------------------------
    Description: 
With the isNamespaceMappingEnabled property set on both client and server, creating a local index on a fresh installation throws an UpgradeRequiredException, while running EXECUTE UPGRADE reports that no upgrade is required:
{code}
sql> create local index if not exists "BLATEST_INDEX" on BLA2."test" (STR,STARTTIME)
[2018-11-22 09:39:47] [00000][-1] Error -1 (00000) : Error while executing SQL "create local index if not exists "BLATEST_INDEX" on BLA2."test" (STR,STARTTIME)": Remote driver error: RuntimeException: java.sql.SQLException: ERROR 2011 (INT13): Operation not allowed since cluster hasn't been upgraded. Call EXECUTE UPGRADE.  BLA2:BLATEST_INDEX -> SQLException: ERROR 2011 (INT13): Operation not allowed since cluster hasn't been upgraded. Call EXECUTE UPGRADE.  BLA2:BLATEST_INDEX -> RemoteWithExtrasException: org.apache.hadoop.hbase.DoNotRetryIOException: ERROR 2011 (INT13): Operation not allowed since cluster hasn't been upgraded. Call EXECUTE UPGRADE.  BLA2:BLATEST_INDEX
[2018-11-22 09:39:47] 	at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:112)
[2018-11-22 09:39:47] 	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1812)
[2018-11-22 09:39:47] 	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16372)
[2018-11-22 09:39:47] 	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:7996)
[2018-11-22 09:39:47] 	at org.apache.hadoop.hbase.regionserver.RSRpcServices.execServiceOnRegion(RSRpcServices.java:1986)
[2018-11-22 09:39:47] 	at org.apache.hadoop.hbase.regionserver.RSRpcServices.execService(RSRpcServices.java:1968)
[2018-11-22 09:39:47] 	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33652)
[2018-11-22 09:39:47] 	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2191)
[2018-11-22 09:39:47] 	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
[2018-11-22 09:39:47] 	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:183)
[2018-11-22 09:39:47] 	at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:163)
[2018-11-22 09:39:47] Caused by: org.apache.phoenix.exception.UpgradeRequiredException: Operation not allowed since cluster hasn't been upgraded. Call EXECUTE UPGRADE.
[2018-11-22 09:39:47] 	at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2596)
[2018-11-22 09:39:47] 	at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2499)
[2018-11-22 09:39:47] 	at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
[2018-11-22 09:39:47] 	at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2499)
[2018-11-22 09:39:47] 	at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
[2018-11-22 09:39:47] 	at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)
[2018-11-22 09:39:47] 	at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
[2018-11-22 09:39:47] 	at java.sql.DriverManager.getConnection(DriverManager.java:664)
[2018-11-22 09:39:47] 	at java.sql.DriverManager.getConnection(DriverManager.java:208)
[2018-11-22 09:39:47] 	at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:400)
[2018-11-22 09:39:47] 	at org.apache.phoenix.util.QueryUtil.getConnectionOnServer(QueryUtil.java:379)
[2018-11-22 09:39:47] 	at org.apache.phoenix.util.QueryUtil.getConnectionOnServer(QueryUtil.java:360)
[2018-11-22 09:39:47] 	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.createTable(MetaDataEndpointImpl.java:1739)
[2018-11-22 09:39:47] 	... 9 more
{code}
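
For reference, here is a minimal client-side sketch of the sequence that produces this contradiction. It assumes a hypothetical ZooKeeper quorum at localhost:2181, phoenix.schema.isNamespaceMappingEnabled=true in the server-side hbase-site.xml, and a BLA2."test" table shaped like the TS.TEST table in the reproduction script below:
{code}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Properties;

public class LocalIndexUpgradeRepro {
    public static void main(String[] args) throws Exception {
        // Client-side namespace mapping; the same property must also be set in
        // the server-side hbase-site.xml, otherwise Phoenix rejects the connection.
        Properties props = new Properties();
        props.setProperty("phoenix.schema.isNamespaceMappingEnabled", "true");

        // Hypothetical quorum - adjust to the actual cluster.
        try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181", props);
             Statement stmt = conn.createStatement()) {
            // On this fresh install, EXECUTE UPGRADE answers that no upgrade is
            // required (surfaced as an SQLException when the cluster is current).
            try {
                stmt.execute("EXECUTE UPGRADE");
            } catch (SQLException e) {
                System.out.println("EXECUTE UPGRADE: " + e.getMessage());
            }
            // ...yet the local index DDL still fails server-side with
            // ERROR 2011 (INT13) / UpgradeRequiredException.
            stmt.execute("create local index if not exists \"BLATEST_INDEX\" "
                    + "on BLA2.\"test\" (STR, STARTTIME)");
        }
    }
}
{code}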

Reproducible with the following script (launched through the query server):
{code}
create schema if not exists TS
create table if not exists TS.TEST (STR varchar not null,INTCOL bigint not null, STARTTIME integer, DUMMY integer default 0 CONSTRAINT PK PRIMARY KEY (STR, INTCOL))
create local index if not exists "TEST_INDEX" on TS.TEST (STR,STARTTIME)
{code}
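
The script was run through the query server, roughly as in the following thin-client sketch (the PQS endpoint http://localhost:8765 is a placeholder):
{code}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class ThinClientRepro {
    public static void main(String[] args) throws Exception {
        // Phoenix/Avatica thin driver from the query server client jar.
        Class.forName("org.apache.phoenix.queryserver.client.Driver");

        // Placeholder PQS endpoint; namespace mapping has to be enabled in the
        // hbase-site.xml seen by the query server and the region servers.
        String url = "jdbc:phoenix:thin:url=http://localhost:8765;serialization=PROTOBUF";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            stmt.execute("create schema if not exists TS");
            stmt.execute("create table if not exists TS.TEST (STR varchar not null, "
                    + "INTCOL bigint not null, STARTTIME integer, DUMMY integer default 0 "
                    + "CONSTRAINT PK PRIMARY KEY (STR, INTCOL))");
            // Fails with ERROR 2011 (INT13) / UpgradeRequiredException.
            stmt.execute("create local index if not exists \"TEST_INDEX\" "
                    + "on TS.TEST (STR, STARTTIME)");
        }
    }
}
{code}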

Note: this is a fresh installation of Phoenix 4.14.1 on top of CDH 5.14.2.

I did not test this on Phoenix 5.0. This is most likely a result of PHOENIX-4579.

> can't create local index due to UpgradeRequiredException
> --------------------------------------------------------
>
>                 Key: PHOENIX-5041
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-5041
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 4.14.0, 4.14.1
>            Reporter: Ievgen Nekrashevych
>            Priority: Major
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)