Posted to issues@phoenix.apache.org by "Istvan Toth (Jira)" <ji...@apache.org> on 2022/04/11 11:21:00 UTC

[jira] [Comment Edited] (PHOENIX-6684) When HBase is pre-split and Phoenix maps the table, SELECT via a local index fails

    [ https://issues.apache.org/jira/browse/PHOENIX-6684?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17520504#comment-17520504 ] 

Istvan Toth edited comment on PHOENIX-6684 at 4/11/22 11:20 AM:
----------------------------------------------------------------

Apache Phoenix 5.0 does not work with HBase 2.2.1 (in fact, HBase 2.2.1 is not supported by any Apache Phoenix version).

If this is a vendor version, please contact your vendor.



Does this happen on Apache Phoenix 5.1.2, or on master HEAD?


was (Author: stoty):
Apache Phoenix 5.0 does not work with HBase 2.2.1 (in fact HBase 2.2.1 is not supported with any Phoenix version)

Does this happen on Phoenix 5.1.2 ?

> When HBase is pre-split and Phoenix maps the table, SELECT via a local index fails
> -----------------------------------------------------------------------------------
>
>                 Key: PHOENIX-6684
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-6684
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 5.0.0
>            Reporter: sjc
>            Priority: Major
>
> When HBase is pre-split and Phoenix maps the table, a SELECT via a local index fails.
> HBase version: 2.2.1
> Phoenix version: 5.0.0
> {code:java}
> hbase-side:
> create 'LJC.STUDENT' ,{NAME =>'F', COMPRESSION =>'SNAPPY'},SPLITS => ['200000', '400000', '600000', '800000']
> Phoenix-side:
> create table if not exists ljc.student(id integer primary key, F.name varchar(20), F.age varchar(20), F.SEX VARCHAR) column_encoded_bytes=0;
> upsert into ljc.student(id,name,age,SEX) values(1,'zhangsan','12','M');
> upsert into ljc.student(id,name,age,SEX) values(2,'lisi','13','W');
> create local index LOCAL_INDEX_STUDENT_1 on LJC.STUDENT(SEX,NAME);
> select * from ljc.student where AGE='12';  # this select succeeds
> select * from ljc.student where SEX='M';   # this select fails {code}
> {code:java}
> 0: jdbc:phoenix:> select * from ljc.student where SEX='M';
> org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: LJC.STUDENT,800000,1649399468066.0c83706d18d87c40b5f04cbc045c21bc.: Some redundant bytes in KeyValue's buffer, startOffset=46, endOffset=50, KeyValueBytesHex=\x00\x00\x00"\x00\x00\x00\x02\x00\x11\x00\x00M\x00zhangsan\x00\x80\x00\x00\x01\x03L#0_0\x00\x00\x01\x80\x07\xDF<\x1A\x04_0\x00\x00\x00\x00\x00\x00, offset=0, length=50
>         at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:114)
>         at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:80)
>         at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:213)
>         at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:77)
>         at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:77)
>         at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:274)
>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:3192)
>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:3437)
>         at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42278)
>         at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:434)
>         at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:132)
>         at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:338)
>         at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:318)
> Caused by: java.lang.IllegalArgumentException: Some redundant bytes in KeyValue's buffer, startOffset=46, endOffset=50, KeyValueBytesHex=\x00\x00\x00"\x00\x00\x00\x02\x00\x11\x00\x00M\x00zhangsan\x00\x80\x00\x00\x01\x03L#0_0\x00\x00\x01\x80\x07\xDF<\x1A\x04_0\x00\x00\x00\x00\x00\x00, offset=0, length=50
>         at org.apache.hadoop.hbase.KeyValueUtil.checkKeyValueBytes(KeyValueUtil.java:649)
>         at org.apache.hadoop.hbase.KeyValue.<init>(KeyValue.java:344)
>         at org.apache.hadoop.hbase.KeyValueUtil.copyToNewKeyValue(KeyValueUtil.java:98)
>         at org.apache.phoenix.util.PhoenixKeyValueUtil.maybeCopyCell(PhoenixKeyValueUtil.java:215)
>         at org.apache.phoenix.schema.tuple.ResultTuple.getValue(ResultTuple.java:93)
>         at org.apache.phoenix.schema.tuple.ResultTuple.getValue(ResultTuple.java:35)
>         at org.apache.phoenix.execute.TupleProjector.projectResults(TupleProjector.java:282)
>         at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:203)
>         ... 10 more
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at org.apache.hadoop.hbase.ipc.RemoteWithExtrasException.instantiateException(RemoteWithExtrasException.java:99)
>         at org.apache.hadoop.hbase.ipc.RemoteWithExtrasException.unwrapRemoteException(RemoteWithExtrasException.java:89)
>         at org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil.makeIOExceptionOfException(ProtobufUtil.java:363)
>         at org.apache.hadoop.hbase.shaded.protobuf.ProtobufUtil.handleRemoteException(ProtobufUtil.java:351)
>         at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:344)
>         at org.apache.hadoop.hbase.client.ScannerCallable.rpcCall(ScannerCallable.java:242)
>         at org.apache.hadoop.hbase.client.ScannerCallable.rpcCall(ScannerCallable.java:58)
>         at org.apache.hadoop.hbase.client.RegionServerCallable.call(RegionServerCallable.java:127)
>         at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:192)
>         at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:387)
>         at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:361)
>         at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:107)
>         at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:80)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>         at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: LJC.STUDENT,800000,1649399468066.0c83706d18d87c40b5f04cbc045c21bc.: Some redundant bytes in KeyValue's buffer, startOffset=46, endOffset=50, KeyValueBytesHex=\x00\x00\x00"\x00\x00\x00\x02\x00\x11\x00\x00M\x00zhangsan\x00\x80\x00\x00\x01\x03L#0_0\x00\x00\x01\x80\x07\xDF<\x1A\x04_0\x00\x00\x00\x00\x00\x00, offset=0, length=50
>         at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:114)
>         at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:80)
>         at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:213)
>         at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:77)
>         at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:77)
>         at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:274)
>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:3192)
>         at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:3437)
>         at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42278)
>         at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:434)
>         at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:132)
>         at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:338)
>         at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:318)
> Caused by: java.lang.IllegalArgumentException: Some redundant bytes in KeyValue's buffer, startOffset=46, endOffset=50, KeyValueBytesHex=\x00\x00\x00"\x00\x00\x00\x02\x00\x11\x00\x00M\x00zhangsan\x00\x80\x00\x00\x01\x03L#0_0\x00\x00\x01\x80\x07\xDF<\x1A\x04_0\x00\x00\x00\x00\x00\x00, offset=0, length=50
>         at org.apache.hadoop.hbase.KeyValueUtil.checkKeyValueBytes(KeyValueUtil.java:649)
>         at org.apache.hadoop.hbase.KeyValue.<init>(KeyValue.java:344)
>         at org.apache.hadoop.hbase.KeyValueUtil.copyToNewKeyValue(KeyValueUtil.java:98)
>         at org.apache.phoenix.util.PhoenixKeyValueUtil.maybeCopyCell(PhoenixKeyValueUtil.java:215)
>         at org.apache.phoenix.schema.tuple.ResultTuple.getValue(ResultTuple.java:93)
>         at org.apache.phoenix.schema.tuple.ResultTuple.getValue(ResultTuple.java:35)
>         at org.apache.phoenix.execute.TupleProjector.projectResults(TupleProjector.java:282)
>         at org.apache.phoenix.iterate.RegionScannerFactory$1.nextRaw(RegionScannerFactory.java:203)
>         ... 10 more
>         at org.apache.hadoop.hbase.ipc.AbstractRpcClient.onCallFinished(AbstractRpcClient.java:389)
>         at org.apache.hadoop.hbase.ipc.AbstractRpcClient.access$100(AbstractRpcClient.java:97)
>         at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:423)
>         at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:419)
>         at org.apache.hadoop.hbase.ipc.Call.callComplete(Call.java:103)
>         at org.apache.hadoop.hbase.ipc.Call.setException(Call.java:118)
>         at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.readResponse(NettyRpcDuplexHandler.java:165)
>         at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.channelRead(NettyRpcDuplexHandler.java:203)
>         at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:359)
>         at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:345)
>         at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:337)
>         at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:323)
>         at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:297)
>         at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:359)
>         at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:345)
>         at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:337)
>         at org.apache.hbase.thirdparty.io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
>         at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:359)
>         at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:345)
>         at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:337)
>         at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1408)
>         at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:359)
>         at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:345)
>         at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:930)
>         at org.apache.hbase.thirdparty.io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
>         at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:677)
>         at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:612)
>         at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:529)
>         at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:491)
>         at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
>         at org.apache.hbase.thirdparty.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
>         ... 1 more{code}



--
This message was sent by Atlassian Jira
(v8.20.1#820001)