Posted to user@hive.apache.org by xish <li...@xish.com> on 2016/10/12 03:21:50 UTC

hive integration with hbase error

hi,
	hive is integrated with hbase, and i can create and query an external hbase table from the hive client without any problem. but when i run the same query through beeline, something goes wrong, so i guess the issue is on the hive jdbc / hiveserver2 side. please give me some advice, many thanks. for reference i sketch the table definition and the beeline call below, followed by the error message:
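(a minimal sketch of what i mean; the column names, column family, and hiveserver2 host here are placeholders, not my real ones -- only the hbase table name data_visit_year_2 matches the log below)

	-- external table mapped onto the hbase table, created from the hive client (works there):
	CREATE EXTERNAL TABLE data_visit_year_2 (rowkey STRING, val STRING)
	STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
	WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:val")
	TBLPROPERTIES ("hbase.table.name" = "data_visit_year_2");

	-- the same query run through beeline (connected roughly with
	-- beeline -u jdbc:hive2://<hiveserver2-host>:10000 -n hadoop) fails:
	SELECT * FROM data_visit_year_2;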


2016-10-12T10:42:31,803 DEBUG [HiveServer2-Handler-Pool: Thread-115]: security.UserGroupInformation (UserGroupInformation.java:doAs(1702)) - PrivilegedActionException as:hadoop (auth:PROXY) via hadoop (auth:SIMPLE) cause:org.apache.hive.service.cli.HiveSQLException: java.io.IOException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Wed Oct 12 10:42:31 CST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68422: row 'data_visit_year_2,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=slave3,16020,1476072663759, seqNum=0

2016-10-12T10:42:31,803 WARN  [HiveServer2-Handler-Pool: Thread-115]: thrift.ThriftCLIService (ThriftCLIService.java:FetchResults(702)) - Error fetching results: 
org.apache.hive.service.cli.HiveSQLException: java.io.IOException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Wed Oct 12 10:42:31 CST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68422: row 'data_visit_year_2,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=slave3,16020,1476072663759, seqNum=0

	at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:361) ~[hive-service-2.0.1.jar:2.0.1]
	at org.apache.hive.service.cli.operation.OperationManager.getOperationNextRowSet(OperationManager.java:241) ~[hive-service-2.0.1.jar:2.0.1]
	at org.apache.hive.service.cli.session.HiveSessionImpl.fetchResults(HiveSessionImpl.java:777) ~[hive-service-2.0.1.jar:2.0.1]
	at sun.reflect.GeneratedMethodAccessor11.invoke(Unknown Source) ~[?:?]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_20]
	at java.lang.reflect.Method.invoke(Method.java:483) ~[?:1.8.0_20]
	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78) ~[hive-service-2.0.1.jar:2.0.1]
	at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36) ~[hive-service-2.0.1.jar:2.0.1]
	at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63) ~[hive-service-2.0.1.jar:2.0.1]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_20]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_20]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698) ~[hadoop-common-2.7.3.jar:?]
	at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59) ~[hive-service-2.0.1.jar:2.0.1]
	at com.sun.proxy.$Proxy35.fetchResults(Unknown Source) ~[?:?]
	at org.apache.hive.service.cli.CLIService.fetchResults(CLIService.java:439) ~[hive-service-2.0.1.jar:2.0.1]
	at org.apache.hive.service.cli.thrift.ThriftCLIService.FetchResults(ThriftCLIService.java:693) [hive-service-2.0.1.jar:2.0.1]
	at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1557) [hive-service-2.0.1.jar:2.0.1]
	at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1542) [hive-service-2.0.1.jar:2.0.1]
	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39) [hive-exec-2.0.1.jar:2.0.1]
	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) [hive-exec-2.0.1.jar:2.0.1]
	at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56) [hive-service-2.0.1.jar:2.0.1]
	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286) [hive-exec-2.0.1.jar:2.0.1]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_20]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_20]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_20]
Caused by: java.io.IOException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Wed Oct 12 10:42:31 CST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68422: row 'data_visit_year_2,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=slave3,16020,1476072663759, seqNum=0

	at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:513) ~[hive-exec-2.0.1.jar:2.0.1]
	at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:420) ~[hive-exec-2.0.1.jar:2.0.1]
	at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:145) ~[hive-exec-2.0.1.jar:2.0.1]
	at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1850) ~[hive-exec-2.0.1.jar:2.0.1]
	at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:356) ~[hive-service-2.0.1.jar:2.0.1]
	... 24 more
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Wed Oct 12 10:42:31 CST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68422: row 'data_visit_year_2,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=slave3,16020,1476072663759, seqNum=0

	at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:276) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:210) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:210) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:326) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:301) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:166) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:161) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:797) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:193) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:89) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.MetaScanner.allTableRegions(MetaScanner.java:324) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.HRegionLocator.getAllRegionLocations(HRegionLocator.java:89) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.util.RegionSizeCalculator.init(RegionSizeCalculator.java:94) ~[hbase-server-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.util.RegionSizeCalculator.<init>(RegionSizeCalculator.java:81) ~[hbase-server-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:256) ~[hbase-server-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplitsInternal(HiveHBaseTableInputFormat.java:501) ~[hive-hbase-handler-2.0.1.jar:2.0.1]
	at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:433) ~[hive-hbase-handler-2.0.1.jar:2.0.1]
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:368) ~[hive-exec-2.0.1.jar:2.0.1]
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:300) ~[hive-exec-2.0.1.jar:2.0.1]
	at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:451) ~[hive-exec-2.0.1.jar:2.0.1]
	at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:420) ~[hive-exec-2.0.1.jar:2.0.1]
	at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:145) ~[hive-exec-2.0.1.jar:2.0.1]
	at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1850) ~[hive-exec-2.0.1.jar:2.0.1]
	at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:356) ~[hive-service-2.0.1.jar:2.0.1]
	... 24 more
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=68422: row 'data_visit_year_2,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=slave3,16020,1476072663759, seqNum=0
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:169) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:65) ~[hbase-client-1.2.3.jar:1.2.3]
	... 3 more
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Call to slave3/10.0.5.58:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to slave3/10.0.5.58:16020 is closing. Call id=11, waitTime=2
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.wrapException(AbstractRpcClient.java:289) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1271) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:34094) ~[hbase-protocol-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:394) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:203) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:64) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:210) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:364) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:338) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:65) ~[hbase-client-1.2.3.jar:1.2.3]
	... 3 more
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to slave3/10.0.5.58:16020 is closing. Call id=11, waitTime=2
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.cleanupCalls(RpcClientImpl.java:1084) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.close(RpcClientImpl.java:863) ~[hbase-client-1.2.3.jar:1.2.3]
	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.run(RpcClientImpl.java:580) ~[hbase-client-1.2.3.jar:1.2.3]
2016-10-12T10:42:31,809 DEBUG [HiveServer2-Handler-Pool: Thread-115]: transport.TSaslTransport (TSaslTransport.java:flush(498)) - writing data length: 8170
2016-10-12T10:42:31,823 DEBUG [HiveServer2-Handler-Pool: Thread-115]: transport.TSaslTransport (TSaslTransport.java:readFrame(459)) - SERVER: reading data length: 117
2016-10-12T10:42:31,824 DEBUG [HiveServer2-Handler-Pool: Thread-115]: security.UserGroupInformation (UserGroupInformation.java:logPrivilegedAction(1722)) - PrivilegedAction as:hadoop (auth:PROXY) via hadoop (auth:SIMPLE) from:org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
2016-10-12T10:42:31,824 INFO  [HiveServer2-Handler-Pool: Thread-115]: conf.HiveConf (HiveConf.java:getLogIdVar(3177)) - Using the default value passed in for log id: aa2fdf23-ec0d-487f-94da-2584ed1e2acb
2016-10-12T10:42:31,824 INFO  [HiveServer2-Handler-Pool: Thread-115]: session.SessionState (SessionState.java:updateThreadName(406)) - Updating thread name to aa2fdf23-ec0d-487f-94da-2584ed1e2acb HiveServer2-Handler-Pool: Thread-115
2016-10-12T10:42:31,825 INFO  [aa2fdf23-ec0d-487f-94da-2584ed1e2acb HiveServer2-Handler-Pool: Thread-115]: conf.HiveConf (HiveConf.java:getLogIdVar(3177)) - Using the default value passed in for log id: aa2fdf23-ec0d-487f-94da-2584ed1e2acb
2016-10-12T10:42:31,825 INFO  [aa2fdf23-ec0d-487f-94da-2584ed1e2acb HiveServer2-Handler-Pool: Thread-115]: session.SessionState (SessionState.java:resetThreadName(417)) - Resetting thread name to  HiveServer2-Handler-Pool: Thread-115
2016-10-12T10:42:31,825 DEBUG [HiveServer2-Handler-Pool: Thread-115]: cli.CLIService (CLIService.java:fetchResults(441)) - OperationHandle [opType=EXECUTE_STATEMENT, getHandleIdentifier()=ab9981a6-baf1-4306-b7a5-ecdf321c6394]: fetchResults()
2016-10-12T10:42:31,825 DEBUG [HiveServer2-Handler-Pool: Thread-115]: transport.TSaslTransport (TSaslTransport.java:flush(498)) - writing data length: 1127