Posted to dev@hbase.apache.org by "Andrew Purtell (JIRA)" <ji...@apache.org> on 2015/04/11 03:29:13 UTC

[jira] [Resolved] (HBASE-5899) Local cluster tries to connect to HDFS, which makes the startup fail

     [ https://issues.apache.org/jira/browse/HBASE-5899?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Purtell resolved HBASE-5899.
-----------------------------------
    Resolution: Invalid

> Local cluster tries to connect to HDFS, which makes the startup fail
> ---------------------------------------------------------------------
>
>                 Key: HBASE-5899
>                 URL: https://issues.apache.org/jira/browse/HBASE-5899
>             Project: HBase
>          Issue Type: Bug
>    Affects Versions: 0.92.0, 0.92.1
>         Environment: Mac OS X Lion
>            Reporter: Yifeng Jiang
>            Priority: Minor
>
> In 0.92, an HBase local cluster won't start because it tries to connect to the local HDFS. This error does not happen in 0.90.
> We should not need to connect to HDFS to run a local cluster.
> Here is my hbase-site.xml
> {code:xml}
> <configuration>
>   <property>
>     <name>hbase.rootdir</name>
>     <value>file:///usr/local/hbase/var/hbase</value>
>   </property>
> </configuration>
> {code}
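> A hedged note (an assumption, not something confirmed in this report): the connection attempt to localhost:8020 shown below suggests that a Hadoop core-site.xml on HBase's classpath sets fs.default.name (or the newer fs.defaultFS) to hdfs://localhost:8020, which would make the default filesystem HDFS even though hbase.rootdir uses the file:// scheme. One way people work around that is to pin the default filesystem to the local filesystem directly in hbase-site.xml, as in this sketch:
> {code:xml}
> <configuration>
>   <property>
>     <name>hbase.rootdir</name>
>     <value>file:///usr/local/hbase/var/hbase</value>
>   </property>
>   <!-- Assumption: pin the default filesystem to the local FS so a core-site.xml
>        on the classpath cannot redirect it to hdfs://localhost:8020 -->
>   <property>
>     <name>fs.default.name</name>
>     <value>file:///</value>
>   </property>
> </configuration>
> {code}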
> This is the error:
> {noformat}
> 2012-04-30 11:32:22,225 ERROR org.apache.hadoop.hbase.master.HMasterCommandLine: Failed to start master
> java.net.ConnectException: Call to localhost/127.0.0.1:8020 failed on connection exception: java.net.ConnectException: Connection refused
>     at org.apache.hadoop.ipc.Client.wrapException(Client.java:1095)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>     at $Proxy11.getProtocolVersion(Unknown Source)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>     at org.apache.hadoop.hbase.util.JVMClusterUtil.startup(JVMClusterUtil.java:185)
>     at org.apache.hadoop.hbase.LocalHBaseCluster.startup(LocalHBaseCluster.java:418)
>     at org.apache.hadoop.hbase.master.HMasterCommandLine.startMaster(HMasterCommandLine.java:141)
>     at org.apache.hadoop.hbase.master.HMasterCommandLine.run(HMasterCommandLine.java:103)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>     at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:76)
>     at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1637)
> Caused by: java.net.ConnectException: Connection refused
>     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
>     at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:489)
>     at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
>     at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
>     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1202)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1046)
>     ... 20 more
> {noformat}
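> To see which filesystem the merged configuration actually resolves to, a small stand-alone check like the sketch below can help. This is a hedged example against the 0.92-era Hadoop/HBase APIs visible in the stack trace (the class name ShowDefaultFs is made up), run with the same classpath and conf directory that bin/hbase uses:
> {code:java}
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.hbase.HBaseConfiguration;
>
> // Hedged diagnostic sketch: prints the default filesystem URI and hbase.rootdir
> // that the merged HBase/Hadoop configuration resolves to. If the default
> // filesystem URI comes back as hdfs://localhost:8020, some core-site.xml on the
> // classpath is overriding the local-mode setup.
> public class ShowDefaultFs {
>   public static void main(String[] args) throws Exception {
>     Configuration conf = HBaseConfiguration.create(); // hbase-site.xml on top of Hadoop defaults
>     System.out.println("fs.default.name = " + conf.get("fs.default.name"));
>     System.out.println("hbase.rootdir   = " + conf.get("hbase.rootdir"));
>     FileSystem fs = FileSystem.get(conf);             // same call JVMClusterUtil.startup() ends up making
>     System.out.println("default filesystem = " + fs.getUri());
>   }
> }
> {code}
> If the printed default filesystem is file:/// and hbase.rootdir is the file:// path above, the local cluster should not need to contact a NameNode at all.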



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)