Posted to dev@ambari.apache.org by "Rudy (JIRA)" <ji...@apache.org> on 2015/10/25 22:12:27 UTC

[jira] [Commented] (AMBARI-11990) Deployment fails on snappy with HDP 2.3 on CentOS7

    [ https://issues.apache.org/jira/browse/AMBARI-11990?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14973439#comment-14973439 ] 

Rudy commented on AMBARI-11990:
-------------------------------

The problem is still present when installing snappy-devel: yum pulls the EL6 build (snappy-devel-1.0.5-1.el6) from the HDP-UTILS repo, which conflicts with the EL7 snappy already installed on the host (see the log below). A quick way to confirm which repository and build yum is resolving is sketched after the log.

{code}
stderr: 
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 153, in <module>
    DataNode().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 34, in install
    self.install_packages(env, params.exclude_packages)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 395, in install_packages
    Package(name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 45, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
    shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install snappy-devel' returned 1. Error: Package: snappy-devel-1.0.5-1.el6.x86_64 (HDP-UTILS-1.1.0.20)
           Requires: snappy(x86-64) = 1.0.5-1.el6
           Installed: snappy-1.1.0-3.el7.x86_64 (@anaconda/7.1)
               snappy(x86-64) = 1.1.0-3.el7
           Available: snappy-1.0.5-1.el6.x86_64 (HDP-UTILS-1.1.0.20)
               snappy(x86-64) = 1.0.5-1.el6
 You could try using --skip-broken to work around the problem
 You could try running: rpm -Va --nofiles --nodigest
 stdout:
2015-10-25 16:57:11,828 - Group['spark'] {}
2015-10-25 16:57:11,829 - Group['hadoop'] {}
2015-10-25 16:57:11,829 - Group['users'] {}
2015-10-25 16:57:11,829 - User['hive'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2015-10-25 16:57:11,830 - User['storm'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2015-10-25 16:57:11,830 - User['zookeeper'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2015-10-25 16:57:11,831 - User['atlas'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2015-10-25 16:57:11,831 - User['ams'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2015-10-25 16:57:11,831 - User['tez'] {'gid': 'hadoop', 'groups': [u'users']}
2015-10-25 16:57:11,832 - User['mahout'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2015-10-25 16:57:11,832 - User['spark'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2015-10-25 16:57:11,833 - User['ambari-qa'] {'gid': 'hadoop', 'groups': [u'users']}
2015-10-25 16:57:11,833 - User['hdfs'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2015-10-25 16:57:11,834 - User['yarn'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2015-10-25 16:57:11,834 - User['mapred'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2015-10-25 16:57:11,835 - User['hbase'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2015-10-25 16:57:11,835 - User['hcat'] {'gid': 'hadoop', 'groups': [u'hadoop']}
2015-10-25 16:57:11,836 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-10-25 16:57:11,837 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2015-10-25 16:57:11,842 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2015-10-25 16:57:11,842 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2015-10-25 16:57:11,843 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-10-25 16:57:11,844 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2015-10-25 16:57:11,848 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2015-10-25 16:57:11,849 - Group['hdfs'] {'ignore_failures': False}
2015-10-25 16:57:11,849 - User['hdfs'] {'ignore_failures': False, 'groups': [u'hadoop', u'hdfs']}
2015-10-25 16:57:11,849 - Directory['/etc/hadoop'] {'mode': 0755}
2015-10-25 16:57:11,860 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2015-10-25 16:57:11,861 - Writing File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] because contents don't match
2015-10-25 16:57:11,913 - Changing owner for /usr/hdp/current/hadoop-client/conf/hadoop-env.sh from 0 to hdfs
2015-10-25 16:57:11,913 - Changing group for /usr/hdp/current/hadoop-client/conf/hadoop-env.sh from 0 to hadoop
2015-10-25 16:57:11,914 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2015-10-25 16:57:11,923 - Repository['HDP-2.3'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.3.2.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2015-10-25 16:57:11,929 - File['/etc/yum.repos.d/HDP.repo'] {'content': InlineTemplate(...)}
2015-10-25 16:57:11,930 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2015-10-25 16:57:11,932 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': InlineTemplate(...)}
2015-10-25 16:57:11,933 - Package['unzip'] {}
2015-10-25 16:57:12,030 - Skipping installation of existing package unzip
2015-10-25 16:57:12,031 - Package['curl'] {}
2015-10-25 16:57:12,065 - Skipping installation of existing package curl
2015-10-25 16:57:12,065 - Package['hdp-select'] {}
2015-10-25 16:57:12,098 - Skipping installation of existing package hdp-select
2015-10-25 16:57:12,742 - Package['rpcbind'] {}
2015-10-25 16:57:12,846 - Installing package rpcbind ('/usr/bin/yum -d 0 -e 0 -y install rpcbind')
2015-10-25 16:57:14,349 - Package['hadoop_2_3_*'] {}
2015-10-25 16:57:14,383 - Skipping installation of existing package hadoop_2_3_*
2015-10-25 16:57:14,383 - Package['snappy'] {}
2015-10-25 16:57:14,415 - Skipping installation of existing package snappy
2015-10-25 16:57:14,416 - Package['snappy-devel'] {}
2015-10-25 16:57:14,448 - Installing package snappy-devel ('/usr/bin/yum -d 0 -e 0 -y install snappy-devel')
{code}
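
As mentioned above, confirming which repository and build yum resolves snappy-devel from can be done on the affected host with standard yum/rpm commands (a diagnostic sketch, not output from this report):

{code}
# Which baseurl does the HDP-UTILS repo file actually point at?
grep -H baseurl /etc/yum.repos.d/HDP-UTILS.repo

# Every snappy-devel build yum can see, and the repo it comes from
yum --showduplicates list snappy-devel

# The snappy runtime already installed by the OS (the 1.1.0 el7 build on CentOS 7)
rpm -q snappy
{code}

Any snappy-devel build that requires snappy(x86-64) = 1.0.5-1.el6 will fail dependency resolution against the installed el7 snappy, exactly as in the log above.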

> Deployment fails on snappy with HDP 2.3 on CentOS7
> --------------------------------------------------
>
>                 Key: AMBARI-11990
>                 URL: https://issues.apache.org/jira/browse/AMBARI-11990
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.1.0
>            Reporter: Jayush Luniya
>            Assignee: Jayush Luniya
>            Priority: Blocker
>             Fix For: 2.1.0
>
>         Attachments: AMBARI-11990.patch
>
>
> For CentOS 7 + HDP 2.3, Ambari is using the CentOS 6 HDP-UTILS repository. See here:
> https://github.com/apache/ambari/blob/trunk/ambari-server/src/main/resources/stacks/HDP/2.3/repos/repoinfo.xml
> Looks like that is the issue.
> {code}
> [root@ip-172-30-0-151 yum.repos.d]# more HDP-UTILS.repo 
> [HDP-UTILS-1.1.0.20]
> name=HDP-UTILS-1.1.0.20
> baseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos6
> path=/
> enabled=1
> gpgcheck=0
> [root@ip-172-30-0-151 yum.repos.d]# /usr/bin/yum -y install snappy-devel
> Loaded plugins: amazon-id, rhui-lb
> Resolving Dependencies
> --> Running transaction check
> ---> Package snappy-devel.x86_64 0:1.0.5-1.el6 will be installed
> --> Processing Dependency: snappy(x86-64) = 1.0.5-1.el6 for package: snappy-devel-1.0.5-1.el6.x86_64
> --> Finished Dependency Resolution
> Error: Package: snappy-devel-1.0.5-1.el6.x86_64 (HDP-UTILS-1.1.0.20)
>            Requires: snappy(x86-64) = 1.0.5-1.el6
>            Installed: snappy-1.1.0-3.el7.x86_64 (@anaconda/7.1)
>                snappy(x86-64) = 1.1.0-3.el7
>            Available: snappy-1.0.5-1.el6.x86_64 (HDP-UTILS-1.1.0.20)
>                snappy(x86-64) = 1.0.5-1.el6
>  You could try using --skip-broken to work around the problem
>  You could try running: rpm -Va --nofiles --nodigest
> {code}
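
For the repo file shown in the quoted repro above (baseurl still pointing at the centos6 path), one manual way to check whether the centos7 HDP-UTILS repository satisfies the dependency is to repoint the baseurl and retry. This is only a local sketch, not the attached patch, and per the comment above the conflict can persist even with the centos7 repo configured:

{code}
# Repoint HDP-UTILS from the centos6 path to the centos7 path
sed -i 's|/repos/centos6|/repos/centos7|' /etc/yum.repos.d/HDP-UTILS.repo

# Drop cached metadata so yum re-reads the changed baseurl
yum clean all

# Retry the install that failed in the repro above
yum -y install snappy-devel
{code}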



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)