Posted to issues@hbase.apache.org by "stack (JIRA)" <ji...@apache.org> on 2016/04/20 18:23:25 UTC

[jira] [Commented] (HBASE-15681) Connect to a remote HBase from Java client (Pentaho) gives org.apache.hadoop.hbase.client.RetriesExhaustedException

    [ https://issues.apache.org/jira/browse/HBASE-15681?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15250183#comment-15250183 ] 

stack commented on HBASE-15681:
-------------------------------

This is more suited to the user mailing list. Please ask your question there rather than here.

> Connect to a remote HBase from Java client (Pentaho) gives org.apache.hadoop.hbase.client.RetriesExhaustedException
> --------------------------------------------------------------------------------------------------------------------
>
>                 Key: HBASE-15681
>                 URL: https://issues.apache.org/jira/browse/HBASE-15681
>             Project: HBase
>          Issue Type: Bug
>          Components: hadoop2, hbase
>            Environment: hbase 1.0.0, hadoop 2.6.4
>            Reporter: Mustapha
>
> Unable to read remote HBase tables from a local Java client (Pentaho) due to a timeout error. The call fails with the following exception (a minimal standalone reproduction sketch follows the trace):
> java.io.IOException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
> Wed Apr 20 10:32:43 WEST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=75181: row 'pentaho_mappings,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=localhost,16020,1461071963695, seqNum=0
> 	at com.pentaho.big.data.bundles.impl.shim.hbase.table.HBaseTableImpl.exists(HBaseTableImpl.java:71)
> 	at org.pentaho.big.data.kettle.plugins.hbase.mapping.MappingAdmin.getMappedTables(MappingAdmin.java:502)
> 	at org.pentaho.big.data.kettle.plugins.hbase.output.HBaseOutputDialog.setupMappedTableNames(HBaseOutputDialog.java:818)
> 	at org.pentaho.big.data.kettle.plugins.hbase.output.HBaseOutputDialog.access$900(HBaseOutputDialog.java:88)
> 	at org.pentaho.big.data.kettle.plugins.hbase.output.HBaseOutputDialog$7.widgetSelected(HBaseOutputDialog.java:398)
> 	at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
> 	at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
> 	at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
> 	at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
> 	at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
> 	at org.pentaho.big.data.kettle.plugins.hbase.output.HBaseOutputDialog.open(HBaseOutputDialog.java:603)
> 	at org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate.editStep(SpoonStepsDelegate.java:125)
> 	at org.pentaho.di.ui.spoon.Spoon.editStep(Spoon.java:8783)
> 	at org.pentaho.di.ui.spoon.trans.TransGraph.editStep(TransGraph.java:3072)
> 	at org.pentaho.di.ui.spoon.trans.TransGraph.mouseDoubleClick(TransGraph.java:755)
> 	at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
> 	at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
> 	at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
> 	at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
> 	at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
> 	at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1347)
> 	at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7989)
> 	at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9269)
> 	at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:662)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> 	at java.lang.reflect.Method.invoke(Unknown Source)
> 	at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
> Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
> Wed Apr 20 10:32:43 WEST 2016, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=75181: row 'pentaho_mappings,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=localhost,16020,1461071963695, seqNum=0
> 	at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:270)
> 	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:225)
> 	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:63)
> 	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
> 	at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:314)
> 	at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:289)
> 	at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:161)
> 	at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:156)
> 	at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:888)
> 	at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:601)
> 	at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:365)
> 	at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:310)
> 	at org.pentaho.hadoop.hbase.factory.HBase10Admin.tableExists(HBase10Admin.java:41)
> 	at org.pentaho.hbase.shim.common.CommonHBaseConnection.tableExists(CommonHBaseConnection.java:206)
> 	at org.pentaho.hbase.shim.common.HBaseConnectionImpl.access$801(HBaseConnectionImpl.java:35)
> 	at org.pentaho.hbase.shim.common.HBaseConnectionImpl$9.call(HBaseConnectionImpl.java:185)
> 	at org.pentaho.hbase.shim.common.HBaseConnectionImpl$9.call(HBaseConnectionImpl.java:181)
> 	at org.pentaho.hbase.shim.common.HBaseConnectionImpl.doWithContextClassLoader(HBaseConnectionImpl.java:76)
> 	at org.pentaho.hbase.shim.common.HBaseConnectionImpl.tableExists(HBaseConnectionImpl.java:181)
> 	at com.pentaho.big.data.bundles.impl.shim.hbase.HBaseConnectionWrapper.tableExists(HBaseConnectionWrapper.java:72)
> 	at com.pentaho.big.data.bundles.impl.shim.hbase.table.HBaseTableImpl.exists(HBaseTableImpl.java:69)
> 	... 28 more
> Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=75181: row 'pentaho_mappings,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=localhost,16020,1461071963695, seqNum=0
> 	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:159)
> 	at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:64)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
> 	at java.lang.Thread.run(Unknown Source)
> Caused by: java.net.ConnectException: Connection refused: no further information
> 	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> 	at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
> 	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
> 	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
> 	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
> 	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:404)
> 	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:710)
> 	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:890)
> 	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:859)
> 	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1193)
> 	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:216)
> 	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:300)
> 	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:32651)
> 	at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:372)
> 	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:199)
> 	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
> 	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
> 	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:371)
> 	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:345)
> 	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
> 	... 4 more
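> For reference, the failing check can be reproduced outside Pentaho with a minimal HBase 1.0 client program. This is a sketch, not part of the original report: the quorum host master-sigma comes from the hbase-site.xml below and the table name pentaho_mappings from the error row above; the class name and client port are assumptions.
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.hbase.HBaseConfiguration;
> import org.apache.hadoop.hbase.TableName;
> import org.apache.hadoop.hbase.client.Admin;
> import org.apache.hadoop.hbase.client.Connection;
> import org.apache.hadoop.hbase.client.ConnectionFactory;
>
> public class RemoteTableExistsCheck {
>   public static void main(String[] args) throws Exception {
>     // Point the client at the remote ZooKeeper quorum explicitly instead of
>     // relying on an hbase-site.xml being on the classpath.
>     Configuration conf = HBaseConfiguration.create();
>     conf.set("hbase.zookeeper.quorum", "master-sigma");
>     conf.set("hbase.zookeeper.property.clientPort", "2181"); // assumed default port
>
>     // Same call Pentaho makes in the trace above: Admin.tableExists scans
>     // hbase:meta, which is where the SocketTimeoutException is raised.
>     try (Connection connection = ConnectionFactory.createConnection(conf);
>          Admin admin = connection.getAdmin()) {
>       boolean exists = admin.tableExists(TableName.valueOf("pentaho_mappings"));
>       System.out.println("pentaho_mappings exists: " + exists);
>     }
>   }
> }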
> hbase-site.xml (a diagnostic sketch for the localhost symptom follows this configuration):
> <?xml version="1.0"?>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
> <!--
> /**
>  *
>  * Licensed to the Apache Software Foundation (ASF) under one
>  * or more contributor license agreements.  See the NOTICE file
>  * distributed with this work for additional information
>  * regarding copyright ownership.  The ASF licenses this file
>  * to you under the Apache License, Version 2.0 (the
>  * "License"); you may not use this file except in compliance
>  * with the License.  You may obtain a copy of the License at
>  *
>  *     http://www.apache.org/licenses/LICENSE-2.0
>  *
>  * Unless required by applicable law or agreed to in writing, software
>  * distributed under the License is distributed on an "AS IS" BASIS,
>  * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
>  * See the License for the specific language governing permissions and
>  * limitations under the License.
>  */
> -->
> <configuration>
> <property>
>    <name>hbase.cluster.distributed</name>
>    <value>true</value>
> </property>
> <property>
>    <name>hbase.rootdir</name>
>    <value>hdfs://master-sigma:54310/hbase</value>
> </property>
> <property>
>     <name>hbase.zookeeper.quorum</name>
>     <value>master-sigma</value>
> </property>
> <property>
>   <name>hbase.master.ipc.address</name>
>   <value>0.0.0.0</value>
> </property>
> <property>
>   <name>hbase.regionserver.ipc.address</name>
>   <value>0.0.0.0</value>
> </property>
> <property>
>   <name>hbase.master</name>
>   <value>master-sigma:16000</value>
> </property>
> </configuration>
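> The "hostname=localhost,16020" in the exception suggests the region server has registered hbase:meta under localhost, so a remote client resolves the meta region back to its own machine and gets "Connection refused". A small diagnostic sketch (same assumed HBase 1.0 client API and master-sigma quorum as above; the class name is illustrative) prints where the client believes hbase:meta lives:
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.hbase.HBaseConfiguration;
> import org.apache.hadoop.hbase.HConstants;
> import org.apache.hadoop.hbase.HRegionLocation;
> import org.apache.hadoop.hbase.TableName;
> import org.apache.hadoop.hbase.client.Connection;
> import org.apache.hadoop.hbase.client.ConnectionFactory;
> import org.apache.hadoop.hbase.client.RegionLocator;
>
> public class MetaLocationCheck {
>   public static void main(String[] args) throws Exception {
>     Configuration conf = HBaseConfiguration.create();
>     conf.set("hbase.zookeeper.quorum", "master-sigma");
>
>     try (Connection connection = ConnectionFactory.createConnection(conf);
>          RegionLocator locator = connection.getRegionLocator(TableName.META_TABLE_NAME)) {
>       // The location is read via ZooKeeper; if this prints localhost:16020 on a
>       // remote machine, the region server is advertising an unreachable hostname.
>       HRegionLocation meta = locator.getRegionLocation(HConstants.EMPTY_START_ROW);
>       System.out.println("hbase:meta is served by " + meta.getHostname() + ":" + meta.getPort());
>     }
>   }
> }
>
> If it does print localhost from the client machine, the usual place to look is name resolution on the region server host (for example an /etc/hosts entry mapping master-sigma to 127.0.0.1), so that the server registers under a name the client can actually reach.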
> Your help would be appreciated. Thanks.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)