Posted to dev@ambari.apache.org by "Aravindan Vijayan (JIRA)" <ji...@apache.org> on 2016/02/09 02:06:18 UTC

[jira] [Updated] (AMBARI-14964) AMS cannot be installed on trunk

     [ https://issues.apache.org/jira/browse/AMBARI-14964?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aravindan Vijayan updated AMBARI-14964:
---------------------------------------
    Due Date: 9/Feb/16

> AMS cannot be installed on trunk
> --------------------------------
>
>                 Key: AMBARI-14964
>                 URL: https://issues.apache.org/jira/browse/AMBARI-14964
>             Project: Ambari
>          Issue Type: Bug
>    Affects Versions: 2.4.0
>            Reporter: Aravindan Vijayan
>            Assignee: Aravindan Vijayan
>            Priority: Blocker
>             Fix For: 2.4.0
>
>
> Installation of AMS fails with the following error:
> {code}
> stderr: 
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/metrics_grafana.py", line 65, in <module>
>     AmsGrafana().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 238, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/metrics_grafana.py", line 28, in install
>     self.install_packages(env, exclude_packages = ['ambari-metrics-collector'])
> TypeError: install_packages() got an unexpected keyword argument 'exclude_packages'
>  stdout:
> 2016-02-08 23:55:57,518 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
> 2016-02-08 23:55:57,519 - Group['hadoop'] {}
> 2016-02-08 23:55:57,520 - Group['users'] {}
> 2016-02-08 23:55:57,520 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,520 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,521 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,521 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
> 2016-02-08 23:55:57,522 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
> 2016-02-08 23:55:57,522 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,523 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,523 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,524 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,524 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2016-02-08 23:55:57,525 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2016-02-08 23:55:57,526 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
> 2016-02-08 23:55:57,529 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
> 2016-02-08 23:55:57,530 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
> 2016-02-08 23:55:57,530 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2016-02-08 23:55:57,531 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
> 2016-02-08 23:55:57,534 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
> 2016-02-08 23:55:57,534 - Group['hdfs'] {}
> 2016-02-08 23:55:57,534 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
> 2016-02-08 23:55:57,535 - FS Type: 
> 2016-02-08 23:55:57,535 - Directory['/etc/hadoop'] {'mode': 0755}
> 2016-02-08 23:55:57,548 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
> 2016-02-08 23:55:57,548 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
> 2016-02-08 23:55:57,557 - Repository['HDP-2.3'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/ubuntu12/2.x/updates/2.3.4.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP', 'mirror_list': None}
> 2016-02-08 23:55:57,562 - File['/tmp/tmpniL5ny'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu12/2.x/updates/2.3.4.0 HDP main'}
> 2016-02-08 23:55:57,562 - Writing File['/tmp/tmpniL5ny'] because contents don't match
> 2016-02-08 23:55:57,563 - File['/tmp/tmpswEIS6'] {'content': StaticFile('/etc/apt/sources.list.d/HDP.list')}
> 2016-02-08 23:55:57,563 - Writing File['/tmp/tmpswEIS6'] because contents don't match
> 2016-02-08 23:55:57,564 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/ubuntu12', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
> 2016-02-08 23:55:57,565 - File['/tmp/tmpEsfP2D'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/ubuntu12 HDP-UTILS main'}
> 2016-02-08 23:55:57,565 - Writing File['/tmp/tmpEsfP2D'] because contents don't match
> 2016-02-08 23:55:57,566 - File['/tmp/tmptJGgBL'] {'content': StaticFile('/etc/apt/sources.list.d/HDP-UTILS.list')}
> 2016-02-08 23:55:57,566 - Writing File['/tmp/tmptJGgBL'] because contents don't match
> 2016-02-08 23:55:57,567 - Package['unzip'] {}
> 2016-02-08 23:55:57,581 - Skipping installation of existing package unzip
> 2016-02-08 23:55:57,581 - Package['curl'] {}
> 2016-02-08 23:55:57,595 - Skipping installation of existing package curl
> 2016-02-08 23:55:57,595 - Package['hdp-select'] {}
> 2016-02-08 23:55:57,608 - Skipping installation of existing package hdp-select
> {code}
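
The failure comes from the keyword mismatch shown in the traceback: metrics_grafana.py passes exclude_packages to Script.install_packages, but the base class on trunk no longer accepts that keyword. Below is a minimal sketch of the mismatch and of a call that matches a signature taking only env; it is an illustration of the problem, not the patch attached to this ticket.

{code}
# Sketch only -- reproduces the shape of the failing install step and shows a
# call that the current Script.install_packages signature (env only) accepts.
from resource_management.libraries.script.script import Script


class AmsGrafana(Script):
    def install(self, env):
        # Failing call on trunk (raises the TypeError in the traceback above):
        #   self.install_packages(env, exclude_packages=['ambari-metrics-collector'])
        # Call without the unsupported keyword:
        self.install_packages(env)


if __name__ == "__main__":
    AmsGrafana().execute()
{code}

How (or whether) the 'ambari-metrics-collector' exclusion gets restored on the Grafana host is left to the actual fix for this ticket; the sketch only shows a call the current base class will accept.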



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)