Posted to issues@ambari.apache.org by "Siddharth Wagle (JIRA)" <ji...@apache.org> on 2017/10/17 00:59:01 UTC

[jira] [Resolved] (AMBARI-22248) HBase default.rootdir config results in deploy failure if value is not overridden

     [ https://issues.apache.org/jira/browse/AMBARI-22248?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Siddharth Wagle resolved AMBARI-22248.
--------------------------------------
    Resolution: Fixed

> HBase default.rootdir config results in deploy failure if value is not overridden
> ---------------------------------------------------------------------------------
>
>                 Key: AMBARI-22248
>                 URL: https://issues.apache.org/jira/browse/AMBARI-22248
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.0.0
>            Reporter: Siddharth Wagle
>            Assignee: Siddharth Wagle
>            Priority: Critical
>             Fix For: 2.6.1
>
>         Attachments: AMBARI-22248.patch
>
>
> The default value for hbase.rootdir is set as shown below, which causes HBase to fail on startup with the exception that follows.
> hbase.rootdir=hdfs://localhost:8020/apps/hbase/data
> {noformat}
> 2017-10-16 17:23:06,761 FATAL [ctr-e134-1499953498516-228160-01-000003:16000.activeMasterManager] master.HMaster: Unhandled exception. Starting shutdown.
> java.net.ConnectException: Call From ctr-e134-1499953498516-228160-01-000003.hwx.site/172.27.63.128 to localhost:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:801)
>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
>         at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1558)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1498)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1398)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
>         at com.sun.proxy.$Proxy16.setSafeMode(Unknown Source)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:718)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
>         at com.sun.proxy.$Proxy17.setSafeMode(Unknown Source)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:280)
>         at com.sun.proxy.$Proxy18.setSafeMode(Unknown Source)
>         at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:2669)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:1359)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:1343)
>         at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:555)
>         at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:1001)
>         at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:455)
>         at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:162)
>         at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:142)
> {noformat}
> Since Ambari's common-services definition of HBase only allows it to be deployed alongside HDFS (via the declared service dependency), we can set the root directory to a relative path and let the hadoop-common code resolve the filesystem URI on its own.
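> As an illustration of the idea (a minimal sketch only; the actual change in AMBARI-22248.patch may differ), the stack default could drop the hard-coded authority and keep just the path, which Hadoop then qualifies against fs.defaultFS from core-site.xml:
> {noformat}
> <!-- hbase-site.xml (sketch): unqualified rootdir, resolved against fs.defaultFS -->
> <property>
>   <name>hbase.rootdir</name>
>   <value>/apps/hbase/data</value>
> </property>
> {noformat}
> With this value the HBase master no longer tries to reach the bogus hdfs://localhost:8020 endpoint seen in the ConnectException above.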



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)