Posted to issues@metron.apache.org by "Justin Leet (JIRA)" <ji...@apache.org> on 2019/04/24 02:10:00 UTC

[jira] [Updated] (METRON-2080) unable to access HDP 2.5.0 repo : http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.5.0.0

     [ https://issues.apache.org/jira/browse/METRON-2080?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Justin Leet updated METRON-2080:
--------------------------------
    Fix Version/s:     (was: 0.4.1)

> unable to access HDP 2.5.0 repo  : http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.5.0.0
> -------------------------------------------------------------------------------------------------------
>
>                 Key: METRON-2080
>                 URL: https://issues.apache.org/jira/browse/METRON-2080
>             Project: Metron
>          Issue Type: Bug
>    Affects Versions: 0.4.1
>            Reporter: Abhishek Sinha
>            Priority: Major
>
> Hi,
> I am trying to deploy HDP 2.5.0 with Metron and am getting the following error:
>  
> 2019-04-16 15:39:58,855 - Could not find stack selector for stack: HDP
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/application_timeline_server.py", line 155, in <module>
>     ApplicationTimelineServer().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 285, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/application_timeline_server.py", line 39, in install
>     self.install_packages(env)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 576, in install_packages
>     retry_count=agent_stack_retry_count)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
>     self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 51, in install_package
>     self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 86, in checked_call_with_retries
>     return self._call_with_retries(cmd, is_checked=True, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 98, in _call_with_retries
>     code, out = func(cmd, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
>     result = function(command, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
>     tries=tries, try_sleep=try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
>     result = _call(command, **kwargs_copy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 293, in _call
>     raise ExecutionFailed(err_msg, code, out, err)
> resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_2_5_0_0_1245-yarn' returned 1. Error: Nothing to do
> stdout:   /var/lib/ambari-agent/data/output-185.txt
> 2019-04-16 15:38:35,068 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
> 2019-04-16 15:38:35,070 - Group['hadoop'] {}
> 2019-04-16 15:38:35,071 - Group['users'] {}
> 2019-04-16 15:38:35,071 - Group['zeppelin'] {}
> 2019-04-16 15:38:35,071 - Group['metron'] {}
> 2019-04-16 15:38:35,072 - Group['kibana'] {}
> 2019-04-16 15:38:35,072 - Group['spark'] {}
> 2019-04-16 15:38:35,072 - Group['livy'] {}
> 2019-04-16 15:38:35,072 - Group['elasticsearch'] {}
> 2019-04-16 15:38:35,072 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,073 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,074 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
> 2019-04-16 15:38:35,074 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,075 - User['metron'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,076 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,076 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,077 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,077 - User['kibana'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,078 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,079 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,079 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,080 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
> 2019-04-16 15:38:35,080 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,081 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,082 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,082 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,083 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,083 - User['elasticsearch'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
> 2019-04-16 15:38:35,084 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2019-04-16 15:38:35,086 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
> 2019-04-16 15:38:35,105 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
> 2019-04-16 15:38:35,106 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
> 2019-04-16 15:38:35,107 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2019-04-16 15:38:35,109 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
> 2019-04-16 15:38:35,129 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
> 2019-04-16 15:38:35,129 - Group['hdfs'] {}
> 2019-04-16 15:38:35,130 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
> 2019-04-16 15:38:35,131 - FS Type:
> 2019-04-16 15:38:35,131 - Directory['/etc/hadoop'] {'mode': 0755}
> 2019-04-16 15:38:35,132 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
> 2019-04-16 15:38:35,153 - Initializing 6 repositories
> 2019-04-16 15:38:35,155 - Repository['HDP-2.5'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.5.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
> 2019-04-16 15:38:35,165 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.5]\nname=HDP-2.5\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.5.3.0\n\npath=/\nenabled=1\ngpgcheck=0'}
> 2019-04-16 15:38:35,165 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
> 2019-04-16 15:38:35,169 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
> 2019-04-16 15:38:35,169 - Repository['ES-Curator-4.x'] {'base_url': 'http://packages.elastic.co/curator/4/centos/7', 'action': ['create'], 'components': [u'CURATOR', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'CURATOR', 'mirror_list': None}
> 2019-04-16 15:38:35,172 - File['/etc/yum.repos.d/CURATOR.repo'] {'content': '[ES-Curator-4.x]\nname=ES-Curator-4.x\nbaseurl=http://packages.elastic.co/curator/4/centos/7\n\npath=/\nenabled=1\ngpgcheck=0'}
> 2019-04-16 15:38:35,172 - Repository['elasticsearch-2.x'] {'base_url': 'https://packages.elastic.co/elasticsearch/2.x/centos', 'action': ['create'], 'components': [u'ELASTICSEARCH', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ELASTICSEARCH', 'mirror_list': None}
> 2019-04-16 15:38:35,175 - File['/etc/yum.repos.d/ELASTICSEARCH.repo'] {'content': '[elasticsearch-2.x]\nname=elasticsearch-2.x\nbaseurl=https://packages.elastic.co/elasticsearch/2.x/centos\n\npath=/\nenabled=1\ngpgcheck=0'}
> 2019-04-16 15:38:35,176 - Repository['kibana-4.x'] {'base_url': 'http://packages.elastic.co/kibana/4.5/centos', 'action': ['create'], 'components': [u'KIBANA', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'KIBANA', 'mirror_list': None}
> 2019-04-16 15:38:35,179 - File['/etc/yum.repos.d/KIBANA.repo'] {'content': '[kibana-4.x]\nname=kibana-4.x\nbaseurl=http://packages.elastic.co/kibana/4.5/centos\n\npath=/\nenabled=1\ngpgcheck=0'}
> 2019-04-16 15:38:35,179 - Repository['METRON-0.4.1'] {'base_url': 'file:///localrepo', 'action': ['create'], 'components': [u'METRON', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'METRON', 'mirror_list': None}
> 2019-04-16 15:38:35,182 - File['/etc/yum.repos.d/METRON.repo'] {'content': '[METRON-0.4.1]\nname=METRON-0.4.1\nbaseurl=file:///localrepo\n\npath=/\nenabled=1\ngpgcheck=0'}
> 2019-04-16 15:38:35,183 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
> 2019-04-16 15:38:35,280 - Skipping installation of existing package unzip
> 2019-04-16 15:38:35,280 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
> 2019-04-16 15:38:35,290 - Skipping installation of existing package curl
> 2019-04-16 15:38:35,290 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
> 2019-04-16 15:38:35,300 - Skipping installation of existing package hdp-select
> 2019-04-16 15:38:35,538 - checked_call['rpm -q --queryformat '%{version}-%{release}' hdp-select | sed -e 's/\.el[0-9]//g''] {'stderr': -1}
> 2019-04-16 15:38:35,590 - checked_call returned (0, '2.5.0.0-1245', '')
> 2019-04-16 15:38:35,591 - Package['hadoop_2_5_0_0_1245-yarn'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
> 2019-04-16 15:38:35,680 - Installing package hadoop_2_5_0_0_1245-yarn ('/usr/bin/yum -d 0 -e 0 -y install hadoop_2_5_0_0_1245-yarn')
> 2019-04-16 15:38:40,403 - Execution of '/usr/bin/yum -d 0 -e 0 -y install hadoop_2_5_0_0_1245-yarn' returned 1. Error: Nothing to do
> 2019-04-16 15:38:40,403 - Failed to install package hadoop_2_5_0_0_1245-yarn. Executing '/usr/bin/yum clean metadata'
> 2019-04-16 15:38:40,672 - Retrying to install package hadoop_2_5_0_0_1245-yarn after 30 seconds
> 2019-04-16 15:39:58,855 - Could not find stack selector for stack: HDP
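>
> For reference, a minimal set of checks that could be run directly on the node, outside of Ambari, to see whether the configured HDP repository is reachable and actually serves the failing package might look like the sketch below. This is only a sketch: the repo id, baseurls, and package name are copied from the log above and from the issue summary, repodata/repomd.xml is the standard yum metadata path, and the exact output will differ per environment.
>
> # Does the baseurl written to /etc/yum.repos.d/HDP.repo answer over HTTP?
> # (first URL is the one from the log, second is the one from the issue summary)
> curl -sS -o /dev/null -w '%{http_code}\n' http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.5.3.0/repodata/repomd.xml
> curl -sS -o /dev/null -w '%{http_code}\n' http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.5.0.0/repodata/repomd.xml
>
> # Rebuild yum metadata and confirm the HDP repo is enabled
> yum clean all
> yum repolist enabled
>
> # Ask the HDP repo alone whether it offers the package that returned "Nothing to do"
> yum --disablerepo='*' --enablerepo='HDP-2.5' list available 'hadoop_2_5_0_0_1245*'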



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)