Posted to dev@ambari.apache.org by "Hudson (JIRA)" <ji...@apache.org> on 2016/02/24 20:54:19 UTC

[jira] [Commented] (AMBARI-14596) Install cluster failed as tried to write config when hadoop conf dir is missing

    [ https://issues.apache.org/jira/browse/AMBARI-14596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15163677#comment-15163677 ] 

Hudson commented on AMBARI-14596:
---------------------------------

ABORTED: Integrated in Ambari-trunk-Commit #4382 (See [https://builds.apache.org/job/Ambari-trunk-Commit/4382/])
AMBARI-14596. Install cluster failed on Accumulo as tried to write (ncole: [http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=f3938a99613fd86a3476fca26dbf030189ffd3f7])
* ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py


> Install cluster failed as tried to write config when hadoop conf dir is missing
> -------------------------------------------------------------------------------
>
>                 Key: AMBARI-14596
>                 URL: https://issues.apache.org/jira/browse/AMBARI-14596
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.2.0
>            Reporter: Alejandro Fernandez
>            Assignee: Alejandro Fernandez
>             Fix For: 2.2.1
>
>         Attachments: AMBARI-14596.trunk.patch
>
>
> Cluster installation failed on Accumulo Client because it was one of the first tasks scheduled, before HDFS Client had been installed; it is the HDFS Client install that pulls in the hadoop rpm and creates the /etc/hadoop/conf folder.
> If a host does not yet have /etc/hadoop/conf, the after-INSTALL hook should not attempt to write config files into it. Once a component that does include the hadoop rpm is installed, that component becomes responsible for writing out the configs.
> Ambari 2.2.1.0-71
> HDP 2.4.0.0-47
> {code}
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 38, in <module>
>     AfterInstallHook().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 33, in hook
>     setup_config()
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 55, in setup_config
>     only_if=format("ls {hadoop_conf_dir}"))
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 67, in action_create
>     encoding = self.resource.encoding
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
>     raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
> resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/core-site.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist
> {code}
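
The guard described above (skip writing configs when /etc/hadoop/conf is missing) can be sketched as follows. This is a minimal, hypothetical Python illustration of the idea, not the actual patch to shared_initialization.py; the function names and return values are assumptions:

```python
import os

def hadoop_conf_dir_present(hadoop_conf_dir):
    # Hypothetical guard: only write *-site.xml files once the hadoop
    # conf directory exists, i.e. the hadoop rpm (installed with HDFS
    # Client) has already created /etc/hadoop/conf on this host.
    return os.path.isdir(hadoop_conf_dir)

def setup_config(hadoop_conf_dir):
    if not hadoop_conf_dir_present(hadoop_conf_dir):
        # Skip quietly; the component that installs the hadoop rpm
        # will write the configs when it is installed later.
        return False
    # In the real hook, XmlConfig resources (core-site.xml, etc.)
    # would be declared here for the agent to render.
    return True
```

The same effect is what the original code attempted with only_if=format("ls {hadoop_conf_dir}"), but as the traceback shows, the File provider's parent-directory check ran before that condition, so the existence test has to happen before any resource is declared.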



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)