Posted to user@hbase.apache.org by Kang Minwoo <mi...@outlook.com> on 2019/05/07 09:23:29 UTC

Why HBase client retry even though AccessDeniedException

Hello users,

(HBase version: 1.2.9)

Recently, I have been testing the behavior of DoNotRetryIOException.

I expected that when the RegionServer sends a DoNotRetryIOException (or an AccessDeniedException, which extends it), the client would not retry.
But in Spark or MapReduce jobs, the client retries even though it receives an AccessDeniedException.

Here is the call stack:

Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts={}, exceptions: {time}, null, java.net.SocketTimeoutException: {detail info}
...
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException): org.apache.hadoop.hbase.security.AccessDeniedException: the client is not authorized
    at (... coprocessor throw AccessDeniedException)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$50.call(RegionCoprocessorHost.java:1300)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost$RegionOperation.call(RegionCoprocessorHost.java:1673)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperation(RegionCoprocessorHost.java:1749)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.execOperationWithResult(RegionCoprocessorHost.java:1722)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.preScannerOpen(RegionCoprocessorHost.java:1295)
	at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2468)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33770)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2216)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:748)

	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1272)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:34216)
	at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:400)
	... 10 more

The client cannot recognize the AccessDeniedException, because what it actually receives is a RemoteWithExtrasException wrapping it.
I wonder if this is a bug.
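
For reference, here is a minimal sketch (Java) of the kind of check I expected the client's retry logic to be able to make. "shouldGiveUp" is a hypothetical helper of mine, not actual HBase client code:

import java.io.IOException;
import org.apache.hadoop.hbase.DoNotRetryIOException;
import org.apache.hadoop.hbase.ipc.RemoteWithExtrasException;
import org.apache.hadoop.hbase.security.AccessDeniedException;

public final class RetryCheckSketch {
  // Hypothetical helper: decide whether an exception returned by the
  // RegionServer should stop the retry loop.
  static boolean shouldGiveUp(IOException e) {
    if (e instanceof DoNotRetryIOException) {
      return true; // covers AccessDeniedException when the instanceof check works
    }
    if (e instanceof RemoteWithExtrasException) {
      // The server-side class name travels inside the wrapper even when the
      // client cannot reconstruct the wrapped exception instance itself.
      String remoteClass = ((RemoteWithExtrasException) e).getClassName();
      return AccessDeniedException.class.getName().equals(remoteClass);
    }
    return false;
  }
}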

Best regards,
Minwoo Kang

Re: Why HBase client retry even though AccessDeniedException

Posted by Kang Minwoo <mi...@outlook.com>.
Thanks! This was already fixed in HBASE-17170.
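
For anyone who finds this thread later: as I understand the fix, when the client cannot materialize the remote exception class (for example, because of class loader differences in Spark/MR containers), it now falls back to the doNotRetry flag carried in the wrapper instead of retrying blindly. A simplified sketch of that idea, not the actual patch:

import java.io.IOException;
import java.lang.reflect.Constructor;
import org.apache.hadoop.hbase.DoNotRetryIOException;

public final class UnwrapSketch {
  // Simplified sketch of the idea behind HBASE-17170 as I understand it
  // (not the actual patch): try to rebuild the server-side exception on the
  // client; if that fails, honor the doNotRetry flag that travels with the
  // wrapper instead of letting the caller retry blindly.
  static IOException unwrap(String className, String message, boolean doNotRetry) {
    try {
      Class<? extends IOException> cls =
          Class.forName(className).asSubclass(IOException.class);
      Constructor<? extends IOException> ctor = cls.getConstructor(String.class);
      return ctor.newInstance(message);
    } catch (ReflectiveOperationException | ClassCastException e) {
      return doNotRetry
          ? new DoNotRetryIOException(className + ": " + message)
          : new IOException(className + ": " + message);
    }
  }
}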

________________________________________
From: Ankit Singhal <an...@gmail.com>
Sent: Wednesday, May 8, 2019 02:50
To: user@hbase.apache.org
Subject: Re: Why HBase client retry even though AccessDeniedException

Yes, you might also be hitting
https://issues.apache.org/jira/browse/HBASE-17170

On Tue, May 7, 2019 at 10:33 AM Josh Elser <el...@apache.org> wrote:

> Sounds like a bug to me.
>
> On 5/7/19 5:52 AM, Kang Minwoo wrote:
> > Why doesn't the client use the "doNotRetry" value in RemoteWithExtrasException?
> >
> > [rest of quoted message trimmed]
>

Re: Why HBase client retry even though AccessDeniedException

Posted by Ankit Singhal <an...@gmail.com>.
Yes, you might also be hitting
https://issues.apache.org/jira/browse/HBASE-17170

On Tue, May 7, 2019 at 10:33 AM Josh Elser <el...@apache.org> wrote:

> Sounds like a bug to me.
>
> On 5/7/19 5:52 AM, Kang Minwoo wrote:
> > Why doesn't the client use the "doNotRetry" value in RemoteWithExtrasException?
> >
> > [rest of quoted message trimmed]
>

Re: Why HBase client retry even though AccessDeniedException

Posted by Josh Elser <el...@apache.org>.
Sounds like a bug to me.

On 5/7/19 5:52 AM, Kang Minwoo wrote:
> Why doesn't the client use the "doNotRetry" value in RemoteWithExtrasException?
>
> [rest of quoted message trimmed]

Re: Why HBase client retry even though AccessDeniedException

Posted by Kang Minwoo <mi...@outlook.com>.
Why doesn't the client use the "doNotRetry" value in RemoteWithExtrasException?
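
What I had in mind is roughly the following (a simplified sketch, not the real retry code; I am assuming the doNotRetry field is exposed through an isDoNotRetry() accessor):

import java.io.IOException;
import org.apache.hadoop.hbase.ipc.RemoteWithExtrasException;

public final class DoNotRetrySketch {
  // Simplified sketch: consult the doNotRetry flag carried by the wrapper
  // directly instead of relying on an instanceof check against a
  // reconstructed exception class.
  static boolean shouldRetry(IOException e, int attempt, int maxAttempts) {
    if (e instanceof RemoteWithExtrasException
        && ((RemoteWithExtrasException) e).isDoNotRetry()) {
      return false; // the server explicitly said not to retry
    }
    return attempt < maxAttempts;
  }
}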

________________________________________
From: Kang Minwoo <mi...@outlook.com>
Sent: Tuesday, May 7, 2019 18:23
To: user@hbase.apache.org
Subject: Why HBase client retry even though AccessDeniedException

[original message trimmed]