Posted to dev@hbase.apache.org by "Wang, Xinglong (JIRA)" <ji...@apache.org> on 2018/01/10 08:53:00 UTC

[jira] [Created] (HBASE-19750) print pretty rowkey in getExceptionMessageAdditionalDetail

Wang, Xinglong created HBASE-19750:
--------------------------------------

             Summary: print pretty rowkey in getExceptionMessageAdditionalDetail
                 Key: HBASE-19750
                 URL: https://issues.apache.org/jira/browse/HBASE-19750
             Project: HBase
          Issue Type: Improvement
          Components: Client
            Reporter: Wang, Xinglong
            Assignee: Wang, Xinglong
            Priority: Minor


Sometimes the rowkey is in binary format and cannot be rendered as a human-readable string. The exception message nevertheless formats it with toString(), producing garbage like '�\(�\'.

Troubleshooting from such an exception is very inefficient: the problematic rowkey cannot be identified from the printout.

The idea here is to print the rowkey with Bytes.toStringBinary(row) in addition to Bytes.toString(row).

If the row was serialized from a human-readable string, then Bytes.toString(row) makes more sense. When it was not, Bytes.toStringBinary(row) will help.

In either case, the output of Bytes.toStringBinary(row) can be pasted into the hbase shell to run a scan, so the corresponding row is easy to identify.
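To illustrate the difference between the two renderings, here is a small self-contained sketch. The toStringBinary method below is a simplified re-implementation of HBase's Bytes.toStringBinary written for this example (printable ASCII passes through, everything else, including backslash, becomes a \xHH escape); it is not the HBase class itself.

```java
import java.nio.charset.StandardCharsets;

public class RowKeyPrinting {
    // Simplified stand-in for HBase's Bytes.toStringBinary: printable
    // ASCII (except backslash) passes through unchanged; every other
    // byte is rendered as a \xHH escape, which the hbase shell accepts.
    static String toStringBinary(byte[] b) {
        StringBuilder sb = new StringBuilder();
        for (byte x : b) {
            int ch = x & 0xFF;
            if (ch >= ' ' && ch <= '~' && ch != '\\') {
                sb.append((char) ch);
            } else {
                sb.append(String.format("\\x%02X", ch));
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // A binary rowkey like the prefix of the region key in the log above.
        byte[] row = {(byte) 0xDF, 0x5C, 0x28, (byte) 0xF5, (byte) 0xC2};
        // Plain string decoding mangles it into replacement characters.
        System.out.println(new String(row, StandardCharsets.UTF_8));
        // The escaped form is unambiguous and shell-safe.
        System.out.println(toStringBinary(row)); // \xDF\x5C(\xF5\xC2
    }
}
```

A human-readable key such as "row1" is printed unchanged by both methods, so adding the escaped form costs nothing in the common case.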

{code:java}
2017-12-16 07:25:41,304 INFO [main] org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl: recovered from org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Sat Dec 16 07:25:41 GMT-07:00 2017, null, java.net.SocketTimeoutException: callTimeout=250000, callDuration=250473: row '�\(�\' on table 'mytable' at region=mytable,\xDF\x5C(\xF5\xC2\x8F\x5C\x1B,1412216342143.5d74ce411eecd40001d9bf6e62f0b607., hostname=mycluster.internal.xx.com,60020,1503881012672, seqNum=6265890293

	at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:271)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:203)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
	at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:320)
	at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:403)
	at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:364)
	at org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl.nextKeyValue(TableRecordReaderImpl.java:205)
	at org.apache.hadoop.hbase.mapreduce.TableRecordReader.nextKeyValue(TableRecordReader.java:147)
	at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase$1.nextKeyValue(TableInputFormatBase.java:216)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
{code}

Current code:
RegionServerCallable.java
{code:java}
public String getExceptionMessageAdditionalDetail() {
  return "row '" + Bytes.toString(row) + "' on table '" + tableName + "' at " + location;
}
{code}
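One possible shape for the improved message is sketched below. This is an illustration, not the committed patch: the class, field values, and the toStringBinary helper (a simplified stand-in for HBase's Bytes.toStringBinary) are stubbed here so the sketch runs standalone, outside RegionServerCallable.

```java
public class PrettyDetailSketch {
    // Illustrative stand-in for HBase's Bytes.toStringBinary: printable
    // ASCII (except backslash) passes through; other bytes become \xHH.
    static String toStringBinary(byte[] b) {
        StringBuilder sb = new StringBuilder();
        for (byte x : b) {
            int ch = x & 0xFF;
            if (ch >= ' ' && ch <= '~' && ch != '\\') sb.append((char) ch);
            else sb.append(String.format("\\x%02X", ch));
        }
        return sb.toString();
    }

    // Sketch of the proposed message: the escaped form is always printable
    // and can be pasted straight into an hbase shell scan.
    static String additionalDetail(byte[] row, String tableName) {
        return "row '" + toStringBinary(row) + "' on table '" + tableName + "'";
    }

    public static void main(String[] args) {
        byte[] row = {(byte) 0xDF, 0x5C, 0x28}; // hypothetical binary rowkey
        System.out.println(additionalDetail(row, "mytable"));
        // → row '\xDF\x5C(' on table 'mytable'
    }
}
```

Whether to print the escaped form alone or alongside Bytes.toString(row) is the open question in this issue; printing both keeps the current behavior for readable keys while making binary keys actionable.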




--
This message was sent by Atlassian JIRA
(v6.4.14#64029)