Posted to user@hbase.apache.org by Ayache Khettar <ay...@googlemail.com> on 2014/09/01 14:05:18 UTC

HBase 0.98 not able to connect to Hadoop 2.4 running on a VM

Hi

I have installed a Hadoop 2.4 cluster on a virtual machine and everything
is up and running. Here are my settings in core-site.xml:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
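As an aside: in Hadoop 2.x, fs.default.name is the deprecated alias for fs.defaultFS, and the host:port in its value is the address every HDFS client (HBase included) will dial. The telnet test discussed below can be reproduced programmatically with a plain TCP connect; the hostnames and ports in the comments are taken from this thread, not verified anywhere:

```python
import socket

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Attempt a plain TCP connection, equivalent to `telnet host port`."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical checks from the Mac host toward the VM:
# can_connect("hadoop", 9000)   # NameNode RPC port from core-site.xml
# can_connect("hadoop", 50070)  # NameNode web UI, reported reachable below
```

If this returns False from the Mac but True inside the VM, the NameNode is up but simply not reachable from outside the VM.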

Hbase settings:

<configuration>

  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://hadoop:54310/hbase</value>
  </property>

  <property>
    <name>hbase.security.authentication</name>
    <value>simple</value>
  </property>

  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>

  <property>
    <name>hbase.zookeeper.property.clientPort</name>
    <value>2181</value>
  </property>

  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>

  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>akhettar</value>
  </property>

  <property>
    <name>hbase.security.authorization</name>
    <value>true</value>
  </property>

  <property>
    <name>hbase.coprocessor.master.classes</name>
    <value>org.apache.hadoop.hbase.security.access.AccessController</value>
  </property>

  <property>
    <name>hbase.coprocessor.region.classes</name>
    <value>org.apache.hadoop.hbase.security.access.AccessController</value>
  </property>

</configuration>

I am running HBase 0.98 on a MacBook, which is hosting the Hadoop
VM. When I try to start the HBase master I get the error below. When I try
to telnet from my MacBook to the VM (telnet hadoop 9000) I get connection
refused too. I can telnet to port 9000 from the VM on which Hadoop is
running, though.

So my guess is that it's nothing to do with the HBase configuration: the
port is closed to outside communication. I can telnet from my machine to
the VM on other ports, though, like 50070.
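That telnet pattern (refused from the host, accepted inside the VM) is also what you would see if the NameNode had bound its RPC port to the loopback interface only, which a fs.default.name of hdfs://localhost:9000 tends to produce. A minimal, Hadoop-free sketch of the two bind modes:

```python
import socket

# Bound to 127.0.0.1: accepts connections arriving via loopback only,
# so a telnet from another machine is refused even though the port is open.
loopback_only = socket.socket()
loopback_only.bind(("127.0.0.1", 0))

# Bound to 0.0.0.0: listens on all interfaces, reachable from outside too
# (the NameNode web UI on 50070 binds 0.0.0.0 by default, which could be
# why that port answers from the Mac).
all_interfaces = socket.socket()
all_interfaces.bind(("0.0.0.0", 0))

print(loopback_only.getsockname()[0])   # 127.0.0.1
print(all_interfaces.getsockname()[0])  # 0.0.0.0
```

If that is what is happening here, pointing fs.default.name at the VM's external hostname instead of localhost (or, if memory serves, setting dfs.namenode.rpc-bind-host) would open port 9000 to the host machine.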

Any ideas?

Thanks

Ayache



HBase logs:

2014-09-01 12:30:32,186 FATAL [master:localhost:60000] master.HMaster:
Unhandled exception. Starting shutdown.
java.net.ConnectException: Call From akhettar/127.0.0.1 to localhost:9000
failed on connection exception: java.net.ConnectException: Connection
refused; For more details see:
http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    at org.apache.hadoop.ipc.Client.call(Client.java:1351)
    at org.apache.hadoop.ipc.Client.call(Client.java:1300)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy12.setSafeMode(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy12.setSafeMode(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:561)
    at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:2146)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:983)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:967)
    at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:446)
    at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:896)
    at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:441)
    at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:152)
    at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:128)
    at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:790)
    at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:603)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:547)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:642)
    at org.apache.hadoop.ipc.Client$Connection.access$2600(Client.java:314)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1399)
    at org.apache.hadoop.ipc.Client.call(Client.java:1318)
    ... 22 more
2014-09-01 12:30:32,187 INFO  [master:localhost:60000] master.HMaster: Aborting
2014-09-01 12:30:32,188 DEBUG [master:localhost:60000] master.HMaster: Stopping service threads
2014-09-01 12:30:32,188 INFO  [master:localhost:60000] ipc.RpcServer: Stopping server on 60000
2014-09-01 12:30:32,188 INFO  [RpcServer.listener,port=60000] ipc.RpcServer: RpcServer.listener,port=60000: stopping
2014-09-01 12:30:32,188 INFO  [master:localhost:60000] master.HMaster: Stopping infoServer
2014-09-01 12:30:32,188 INFO  [RpcServer.responder] ipc.RpcServer: RpcServer.responder: stopped
2014-09-01 12:30:32,188 INFO  [RpcServer.responder] ipc.RpcServer: RpcServer.responder: stopping
2014-09-01 12:30:32,191 INFO  [master:localhost:60000] mortbay.log: Stopped SelectChannelConnector@0.0.0.0:60010
2014-09-01 12:30:32,302 INFO  [master:localhost:60000] zookeeper.ZooKeeper: Session: 0x1483077a7890008 closed
2014-09-01 12:30:32,303 INFO  [master:localhost:60000] master.HMaster: HMaster main thread exiting
2014-09-01 12:30:32,303 INFO  [main-EventThread] zookeeper.ClientCnxn: EventThread shut down
2014-09-01 12:30:32,303 ERROR [main] master.HMasterCommandLine: Master exiting
java.lang.RuntimeException: HMaster Aborted

Re: HBase 0.98 not able to connect to Hadoop 2.4 running on a VM

Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
Can you try changing

 <property>
   <name>hbase.rootdir</name>
   <value>hdfs://hadoop:54310/hbase</value>
 </property>

to

 <property>
   <name>hbase.rootdir</name>
   <value>hdfs://localhost:9000/hbase</value>
 </property>

? Also, why did you put 54310?
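Whatever value ends up in hbase.rootdir, its host:port has to match a NameNode address that HBase can actually reach; otherwise the master dials an endpoint nothing is listening on. A throwaway sketch (XML values copied from this thread) that cross-checks the authorities of the two files:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

def get_prop(xml_text: str, name: str) -> str:
    """Read one <property> value from a Hadoop-style configuration file."""
    root = ET.fromstring(xml_text)
    for prop in root.iter("property"):
        if prop.findtext("name") == name:
            return prop.findtext("value")
    raise KeyError(name)

core_site = """<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>"""

hbase_site = """<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://hadoop:54310/hbase</value>
  </property>
</configuration>"""

namenode = urlparse(get_prop(core_site, "fs.default.name"))
rootdir = urlparse(get_prop(hbase_site, "hbase.rootdir"))
# The authority (host:port) should agree between the two files.
print(namenode.netloc == rootdir.netloc)  # False: localhost:9000 vs hadoop:54310
```

Here the check prints False, so the two configs point at different endpoints; either one only works if a NameNode reachable from the HBase host is actually listening there.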




2014-09-01 8:05 GMT-04:00 Ayache Khettar <ay...@googlemail.com>:

> [quoted message snipped]