Posted to dev@ambari.apache.org by "Dmitry Lysnichenko (JIRA)" <ji...@apache.org> on 2015/11/27 15:33:10 UTC

[jira] [Created] (AMBARI-14100) RU and EU upgrade failed on first step

Dmitry Lysnichenko created AMBARI-14100:
-------------------------------------------

             Summary: RU and EU upgrade failed on first step
                 Key: AMBARI-14100
                 URL: https://issues.apache.org/jira/browse/AMBARI-14100
             Project: Ambari
          Issue Type: Bug
            Reporter: Dmitry Lysnichenko
            Assignee: Dmitry Lysnichenko
         Attachments: AMBARI-14100.patch


Rolling Upgrade (RU) and Express Upgrade (EU) failed on the first step.

{code}
2015-11-26 16:59:09,618 - Task. Type: EXECUTE, Script: scripts/namenode.py - Function: prepare_rolling_upgrade
2015-11-26 16:59:09,995 - call['conf-select create-conf-dir --package hadoop --stack-version 2.3.4.0-3335 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2015-11-26 16:59:10,057 - call returned (1, '/etc/hadoop/2.3.4.0-3335/0 exist already', '')
2015-11-26 16:59:10,057 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.3.4.0-3335 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2015-11-26 16:59:10,114 - checked_call returned (0, '/usr/hdp/2.3.4.0-3335/hadoop/conf -> /etc/hadoop/2.3.4.0-3335/0')
2015-11-26 16:59:10,115 - For package : hadoop, DIRS = [{'current_dir': '/usr/hdp/current/hadoop-client/conf', 'conf_dir': '/etc/hadoop/conf'}]
2015-11-26 16:59:10,115 - For package : hadoop, Source dir: /etc/hadoop/conf, Dest dir: /usr/hdp/current/hadoop-client/conf
2015-11-26 16:59:10,115 - Normalized Conf Dir : /etc/hadoop/conf, Normalized Current Dir : /etc/hadoop/2.3.0.0-2557/0
2015-11-26 16:59:10,115 - /etc/hadoop/conf exists and points to incorrect path /usr/hdp/current/hadoop-client/conf
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 432, in <module>
    NameNode().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 217, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 174, in prepare_rolling_upgrade
    hfds_binary = self.get_hdfs_binary()
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 77, in get_hdfs_binary
    import params
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/params.py", line 25, in <module>
    from params_linux import *
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/params_linux.py", line 20, in <module>
    import status_params
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/status_params.py", line 53, in <module>
    hadoop_conf_dir = conf_select.get_hadoop_conf_dir()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/conf_select.py", line 316, in get_hadoop_conf_dir
    select(stack_name, "hadoop", stack_version)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/conf_select.py", line 270, in select
    os.remove(normalized_conf_dir)
OSError: [Errno 13] Permission denied: '/etc/hadoop/conf'

Error: [Errno 13] Permission denied: '/etc/hadoop/conf'
{code}

The cause of the bug is that the code introduced by AMBARI-14052 does not work with a non-root agent: the raw os.remove() / os.symlink() calls in conf_select.py fail with "Permission denied" because the agent user cannot modify /etc/hadoop/conf directly. These calls should be replaced with RMF (resource_management framework) resources, which run through the agent's sudo wrappers.
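
A minimal sketch of the intended replacement inside conf_select.select(), assuming the RMF Link resource is used in place of the raw os calls; the variable names normalized_conf_dir and normalized_current_dir mirror the log output above and are illustrative, not the committed patch:

{code}
# Sketch only, not the committed patch. Replacing the raw os.remove()/os.symlink()
# calls with RMF resources routes the filesystem operations through the agent's
# sudo wrappers, so they also succeed when the agent runs as a non-root user.
from resource_management.core.resources.system import Link

# remove the stale /etc/hadoop/conf symlink that points at the wrong target
Link(normalized_conf_dir, action="delete")

# recreate it pointing at the versioned configuration directory
Link(normalized_conf_dir, to=normalized_current_dir)
{code}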




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)