Posted to user@ambari.apache.org by guxiaobo1982 <gu...@qq.com> on 2014/11/05 02:04:37 UTC

how to install a specific version of HDP using Ambari

Hi,


The current GUI lets the user choose a major version of HDP to install, such as 2.1 or 2.0, and will then install the latest minor version, such as 2.1.7. How can I choose to install a specific minor version, such as 2.1.3? I found that 2.1.7 may have some bugs in Hive.


Regards,


Xiaobo gu

Re: how to install a specific version of HDP using Ambari

Posted by Yusaku Sako <yu...@hortonworks.com>.
Right, but did you specify the base URLs for HDP 2.1.3 in Select Stack page
of the Install Wizard?
If you've done that and you still encountered a failure during "Install,
Start and Test" page, you should do what Jeff suggested:

> 1) Can you confirm what is in /etc/yum.repos.d/HDP.repo? This file is
> generated by Ambari based on the Base URLs you enter and should reflect the
> 2.1.3 URLs that you entered during the wizard.
> 2) "yum clean all"
> 3) "yum info hadoop" and see what version it returns.
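
On a CentOS 6 host, those three checks look like this. This is a sketch using an inline sample of the generated file (the real one is /etc/yum.repos.d/HDP.repo; the field values here are assumptions based on the URLs in this thread):

```shell
# Sample of what Ambari writes to /etc/yum.repos.d/HDP.repo from the
# wizard's Base URL (a stand-in file; exact field values on a real host
# may differ, but the yum repo format is the same).
cat > /tmp/HDP.repo.sample <<'EOF'
[HDP-2.1]
name=HDP-2.1
baseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0
enabled=1
gpgcheck=0
EOF

# Check 1: the baseurl should carry the 2.1.3.0 path, not 2.1.7.0
grep '^baseurl=' /tmp/HDP.repo.sample

# Checks 2 and 3 run against the real host (not runnable here):
#   yum clean all      # drop cached metadata that may still point at 2.1.7
#   yum info hadoop    # see which version yum now resolves
```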

Yusaku

On Wed, Nov 5, 2014 at 5:29 PM, guxiaobo1982 <gu...@qq.com> wrote:

> this is not an HDP.repo file but the ambari.repo file; I installed Ambari
> using the binaries from the Apache site.
>
>
> [root@ambari yum.repos.d]# more ambari.repo
>
> [ambari-1.x]
> name=Ambari 1.x
> baseurl=http://public-repo-1.hortonworks.com/ambari/centos6/1.x/GA
> gpgcheck=1
> gpgkey=http://public-repo-1.hortonworks.com/ambari/centos6/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
> enabled=1
> priority=1
>
> [Updates-ambari-1.6.1]
> name=ambari-1.6.1 - Updates
> baseurl=http://public-repo-1.hortonworks.com/ambari/centos6/1.x/updates/1.6.1
> gpgcheck=1
> gpgkey=http://public-repo-1.hortonworks.com/ambari/centos6/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
> enabled=1
> priority=1
>
> [root@ambari yum.repos.d]#
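
For context, ambari.repo only tells yum where Ambari itself comes from; the HDP stack repos live in separate files (HDP.repo, HDP-UTILS.repo) that Ambari writes during cluster install. A quick sketch of listing every configured baseurl at once, with sample files standing in for /etc/yum.repos.d/:

```shell
# Create two sample repo files mirroring the ones discussed in this thread
mkdir -p /tmp/repos.d
printf '[ambari-1.x]\nbaseurl=http://public-repo-1.hortonworks.com/ambari/centos6/1.x/GA\n' \
  > /tmp/repos.d/ambari.repo
printf '[HDP-2.1]\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0\n' \
  > /tmp/repos.d/HDP.repo

# On a real host this would be: grep -H '^baseurl' /etc/yum.repos.d/*.repo
grep -H '^baseurl' /tmp/repos.d/*.repo
```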
>
>
> ------------------ Original ------------------
> *From:* "Jeff Sposetti" <je...@hortonworks.com>
> *Send time:* Wednesday, Nov 5, 2014 9:18 PM
> *To:* "user@ambari.apache.org" <us...@ambari.apache.org>
> *Subject:* Re: how to install a specific version of HDP using Ambari
>
> It looks like it's still trying to grab the 2.1.7 RPMs.
>
> 1) Can you confirm what is in /etc/yum.repos.d/HDP.repo? This file is
> generated by Ambari based on the Base URLs you enter and should reflect the
> 2.1.3 URLs that you entered during the wizard.
> 2) "yum clean all"
> 3) "yum info hadoop" and see what version it returns.
>
>
> On Wed, Nov 5, 2014 at 4:16 AM, guxiaobo1982 <gu...@qq.com> wrote:
>
>> I tried
>> http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0 and
>> http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.5.0 for
>> CentOS 6,
>>
>> and both failed with an error like this
>>
>>
>> stderr:   /var/lib/ambari-agent/data/errors-302.txt
>>
>> 2014-11-05 17:12:21,987 - Error while executing command 'install':
>> Traceback (most recent call last):
>>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 111, in execute
>>     method(env)
>>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.1/services/FALCON/package/scripts/falcon_client.py", line 25, in install
>>     self.install_packages(env)
>>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 167, in install_packages
>>     Package(name)
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>>     self.env.run()
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
>>     self.run_action(resource, action)
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
>>     provider_action()
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 40, in action_install
>>     self.install_package(package_name)
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 36, in install_package
>>     shell.checked_call(cmd)
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 35, in checked_call
>>     return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout)
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 90, in _call
>>     raise Fail(err_msg)
>> Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install falcon' returned 1. Error Downloading Packages:
>>   hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   bigtop-jsvc-1.0.10-1.el6.x86_64: failure: bigtop-jsvc/bigtop-jsvc-1.0.10-1.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   hadoop-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   zookeeper-3.4.5.2.1.7.0-784.el6.noarch: failure: zookeeper/zookeeper-3.4.5.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   falcon-0.5.0.2.1.7.0-784.el6.noarch: failure: falcon/falcon-0.5.0.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>
>> stdout:   /var/lib/ambari-agent/data/output-302.txt
>>
>> 2014-11-05 17:12:13,977 - Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf -x "" --retry 10     http://ambari.bh.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
>> 2014-11-05 17:12:13,987 - Skipping Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf -x "" --retry 10     http://ambari.bh.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
>> 2014-11-05 17:12:13,998 - Repository['HDP-2.1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP', 'mirror_list': None}
>> 2014-11-05 17:12:14,003 - File['/etc/yum.repos.d/HDP.repo'] {'content': Template('repo_suse_rhel.j2')}
>> 2014-11-05 17:12:14,004 - Repository['HDP-UTILS-1.1.0.17'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.17/repos/centos6', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
>> 2014-11-05 17:12:14,006 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': Template('repo_suse_rhel.j2')}
>> 2014-11-05 17:12:14,007 - Package['unzip'] {}
>> 2014-11-05 17:12:14,026 - Skipping installing existent package unzip
>> 2014-11-05 17:12:14,026 - Package['curl'] {}
>> 2014-11-05 17:12:14,048 - Skipping installing existent package curl
>> 2014-11-05 17:12:14,048 - Execute['mkdir -p /tmp/HDP-artifacts/ ;   curl -kf -x ""   --retry 10 http://ambari.bh.com:8080/resources//jdk-7u45-linux-x64.tar.gz -o /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz'] {'environment': ..., 'not_if': 'test -e /usr/jdk64/jdk1.7.0_45/bin/java', 'path': ['/bin', '/usr/bin/']}
>> 2014-11-05 17:12:14,057 - Skipping Execute['mkdir -p /tmp/HDP-artifacts/ ;   curl -kf -x ""   --retry 10 http://ambari.bh.com:8080/resources//jdk-7u45-linux-x64.tar.gz -o /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz'] due to not_if
>> 2014-11-05 17:12:14,057 - Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz > /dev/null 2>&1'] {'not_if': 'test -e /usr/jdk64/jdk1.7.0_45/bin/java', 'path': ['/bin', '/usr/bin/']}
>> 2014-11-05 17:12:14,066 - Skipping Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz > /dev/null 2>&1'] due to not_if
>> 2014-11-05 17:12:14,066 - Group['hadoop'] {'ignore_failures': False}
>> 2014-11-05 17:12:14,067 - Modifying group hadoop
>> 2014-11-05 17:12:14,090 - Group['users'] {'ignore_failures': False}
>> 2014-11-05 17:12:14,091 - Modifying group users
>> 2014-11-05 17:12:14,116 - Group['users'] {'ignore_failures': False}
>> 2014-11-05 17:12:14,117 - Modifying group users
>> 2014-11-05 17:12:14,141 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
>> 2014-11-05 17:12:14,141 - Modifying user ambari-qa
>> 2014-11-05 17:12:14,158 - File['/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>> 2014-11-05 17:12:14,159 - Execute['/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
>> 2014-11-05 17:12:14,175 - Skipping Execute['/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if
>> 2014-11-05 17:12:14,177 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>> 2014-11-05 17:12:14,177 - Modifying user hbase
>> 2014-11-05 17:12:14,188 - File['/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>> 2014-11-05 17:12:14,189 - Execute['/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] {'not_if': 'test $(id -u hbase) -gt 1000'}
>> 2014-11-05 17:12:14,200 - Skipping Execute['/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] due to not_if
>> 2014-11-05 17:12:14,200 - Group['nagios'] {'ignore_failures': False}
>> 2014-11-05 17:12:14,200 - Modifying group nagios
>> 2014-11-05 17:12:14,221 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False}
>> 2014-11-05 17:12:14,221 - Modifying user nagios
>> 2014-11-05 17:12:14,232 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False}
>> 2014-11-05 17:12:14,232 - Modifying user oozie
>> 2014-11-05 17:12:14,242 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False}
>> 2014-11-05 17:12:14,243 - Modifying user hcat
>> 2014-11-05 17:12:14,252 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False}
>> 2014-11-05 17:12:14,253 - Modifying user hcat
>> 2014-11-05 17:12:14,262 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False}
>> 2014-11-05 17:12:14,262 - Modifying user hive
>> 2014-11-05 17:12:14,272 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False}
>> 2014-11-05 17:12:14,272 - Modifying user yarn
>> 2014-11-05 17:12:14,282 - Group['nobody'] {'ignore_failures': False}
>> 2014-11-05 17:12:14,282 - Modifying group nobody
>> 2014-11-05 17:12:14,305 - Group['nobody'] {'ignore_failures': False}
>> 2014-11-05 17:12:14,305 - Modifying group nobody
>> 2014-11-05 17:12:14,326 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
>> 2014-11-05 17:12:14,326 - Modifying user nobody
>> 2014-11-05 17:12:14,337 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
>> 2014-11-05 17:12:14,337 - Modifying user nobody
>> 2014-11-05 17:12:14,350 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>> 2014-11-05 17:12:14,350 - Modifying user hdfs
>> 2014-11-05 17:12:14,366 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>> 2014-11-05 17:12:14,368 - Modifying user mapred
>> 2014-11-05 17:12:14,387 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False}
>> 2014-11-05 17:12:14,388 - Modifying user zookeeper
>> 2014-11-05 17:12:14,405 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>> 2014-11-05 17:12:14,405 - Modifying user storm
>> 2014-11-05 17:12:14,425 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>> 2014-11-05 17:12:14,426 - Modifying user falcon
>> 2014-11-05 17:12:14,446 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
>> 2014-11-05 17:12:14,446 - Modifying user tez
>> 2014-11-05 17:12:14,576 - Package['falcon'] {}
>> 2014-11-05 17:12:14,610 - Installing package falcon ('/usr/bin/yum -d 0 -e 0 -y install falcon')
>> 2014-11-05 17:12:21,987 - Error while executing command 'install':
>> Traceback (most recent call last):
>>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 111, in execute
>>     method(env)
>>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.1/services/FALCON/package/scripts/falcon_client.py", line 25, in install
>>     self.install_packages(env)
>>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 167, in install_packages
>>     Package(name)
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>>     self.env.run()
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
>>     self.run_action(resource, action)
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
>>     provider_action()
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 40, in action_install
>>     self.install_package(package_name)
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 36, in install_package
>>     shell.checked_call(cmd)
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 35, in checked_call
>>     return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout)
>>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 90, in _call
>>     raise Fail(err_msg)
>> Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install falcon' returned 1. Error Downloading Packages:
>>   hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   bigtop-jsvc-1.0.10-1.el6.x86_64: failure: bigtop-jsvc/bigtop-jsvc-1.0.10-1.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   hadoop-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   zookeeper-3.4.5.2.1.7.0-784.el6.noarch: failure: zookeeper/zookeeper-3.4.5.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>>   falcon-0.5.0.2.1.7.0-784.el6.noarch: failure: falcon/falcon-0.5.0.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
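
Note the version embedded in the failing RPM names: they are all 2.1.7.0 builds, even though the Repository['HDP-2.1'] line in the stdout above shows the 2.1.3.0 base URL. That mismatch is exactly why the "yum clean all" advice in this thread matters. A small sketch of pulling the HDP version out of an RPM name (the regex is an assumption based on the name layout shown in the log):

```shell
# HDP RPM names embed <apache version>.<hdp version>-<build>, e.g.
# hadoop-yarn-2.4.0.2.1.7.0-784 is Apache Hadoop 2.4.0, HDP 2.1.7.0, build 784.
rpm_name="hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64"

# Grab the four-part version that sits right before the build number
hdp_version=$(echo "$rpm_name" | grep -oE '[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+-[0-9]+' | cut -d- -f1)
echo "$hdp_version"    # 2.1.7.0 -- not the 2.1.3.0 that the repo file requests
```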
>>
>>
>>
>>
>> ------------------ Original ------------------
>> *From:* "guxiaobo1982" <gu...@qq.com>
>> *Send time:* Wednesday, Nov 5, 2014 3:37 PM
>> *To:* "user" <us...@ambari.apache.org>
>> *Subject:* Re: how to install a specific version of HDP using Ambari
>>
>> Is there a mapping between HDP and HDP-UTILS minor versions? If I choose
>> to install HDP 2.1.5, which version of HDP-UTILS should I use?
>>
>>
>> ------------------ Original ------------------
>> *From:* "Jeff Sposetti" <je...@hortonworks.com>
>> *Send time:* Wednesday, Nov 5, 2014 9:10 AM
>> *To:* "user@ambari.apache.org" <us...@ambari.apache.org>
>> *Subject:* Re: how to install a specific version of HDP using Ambari
>>
>> You are correct: when you select the HDP 2.1 stack and Ambari can reach
>> the internet (to check for the latest release), it will grab the repos for
>> the latest HDP 2.1.x maintenance release.
>>
>> But if you want to install an older version of HDP 2.1.x, do the
>> following:
>>
>> 1) During install, on the Select Stack page, select HDP 2.1
>> 2) Expand the Advanced Repository Options section
>> 3) Enter the Base URL for the HDP 2.1.x version you wish to install
>> (overwriting the 2.1.7.0 repo entries that show up by default)
>>
>> Looking at the docs here:
>>
>> http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.3/index.html
>>
>> The Base URL for HDP is
>> http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0 and
>> HDP-UTILS is
>> http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.17/repos/centos6
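
The public Base URLs above follow one pattern, with only the OS and the four-part minor version changing. A tiny helper makes that explicit (a sketch; the pattern is taken from the URLs in this thread and applies to this 2014-era public-repo-1 layout only):

```shell
# Build the HDP Base URL the wizard expects for a given OS and minor version
hdp_base_url() {
  local os="$1" version="$2"
  echo "http://public-repo-1.hortonworks.com/HDP/${os}/2.x/updates/${version}"
}

hdp_base_url centos6 2.1.3.0
# Before pasting the URL into the wizard, a reachability check could be:
#   curl -sfI "$(hdp_base_url centos6 2.1.3.0)/repodata/repomd.xml"
```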
>>
>>
>>
>> On Tue, Nov 4, 2014 at 8:04 PM, guxiaobo1982 <gu...@qq.com> wrote:
>>
>>> Hi,
>>>
>>> The current GUI lets the user choose a major version of HDP to install,
>>> such as 2.1 or 2.0, and will then install the latest minor version, such
>>> as 2.1.7. How can I choose to install a specific minor version, such as
>>> 2.1.3? I found that 2.1.7 may have some bugs in Hive.
>>>
>>> Regards,
>>>
>>> Xiaobo gu
>>>
>>
>>
>> CONFIDENTIALITY NOTICE
>> NOTICE: This message is intended for the use of the individual or entity
>> to which it is addressed and may contain information that is confidential,
>> privileged and exempt from disclosure under applicable law. If the reader
>> of this message is not the intended recipient, you are hereby notified that
>> any printing, copying, dissemination, distribution, disclosure or
>> forwarding of this communication is strictly prohibited. If you have
>> received this communication in error, please contact the sender immediately
>> and delete it from your system. Thank You.
>>
>
>
>


Re: how to install a specific version of HDP using Ambari

Posted by guxiaobo1982 <gu...@qq.com>.
this is not a HDP.repo file, but ambari.repo file, I installed ambari using the binaries got from apache site.




 
[root@ambari yum.repos.d]# more ambari.repo 
 
[ambari-1.x]
 
name=Ambari 1.x
 
baseurl=http://public-repo-1.hortonworks.com/ambari/centos6/1.x/GA
 
gpgcheck=1
 
gpgkey=http://public-repo-1.hortonworks.com/ambari/centos6/RPM-GPG-KEY/RPM-GPG-K
 
EY-Jenkins
 
enabled=1
 
priority=1
 


 
[Updates-ambari-1.6.1]
 
name=ambari-1.6.1 - Updates
 
baseurl=http://public-repo-1.hortonworks.com/ambari/centos6/1.x/updates/1.6.1
 
gpgcheck=1
 
gpgkey=http://public-repo-1.hortonworks.com/ambari/centos6/RPM-GPG-KEY/RPM-GPG-K
 
EY-Jenkins
 
enabled=1
 
priority=1
 
[root@ambari yum.repos.d]# 





------------------ Original ------------------
From:  "Jeff Sposetti";<je...@hortonworks.com>;
Send time: Wednesday, Nov 5, 2014 9:18 PM
To: "user@ambari.apache.org"<us...@ambari.apache.org>; 

Subject:  Re: how to install a specific version of HDP using Ambari



I looks like it's still trying to grab the 2.1.7 RPMs.

1) Can you confirm what is in /etc/yum.repos.d/HDP.repo ? This file is generated by Ambari based on the Base URLs you enter and should reflect the 2.1.3 urls that you entered during the wizard.
2) "yum clean all" 

3) "yum info hadoop" and see what version it returns.



On Wed, Nov 5, 2014 at 4:16 AM, guxiaobo1982 <gu...@qq.com> wrote:
I tried http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0 and http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.5.0 for CENTOS6 both,


can with error like this




stderr:   /var/lib/ambari-agent/data/errors-302.txt
2014-11-05 17:12:21,987 - Error while executing command 'install': Traceback (most recent call last):   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 111, in execute     method(env)   File "/var/lib/ambari-agent/cache/stacks/HDP/2.1/services/FALCON/package/scripts/falcon_client.py", line 25, in install     self.install_packages(env)   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 167, in install_packages     Package(name)   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__     self.env.run()   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run     self.run_action(resource, action)   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action     provider_action()   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 40, in action_install     self.install_package(package_name)   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 36, in install_package     shell.checked_call(cmd)   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 35, in checked_call     return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout)   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 90, in _call     raise Fail(err_msg) Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install falcon' returned 1. Error Downloading Packages:   hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   bigtop-jsvc-1.0.10-1.el6.x86_64: failure: bigtop-jsvc/bigtop-jsvc-1.0.10-1.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   
hadoop-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   zookeeper-3.4.5.2.1.7.0-784.el6.noarch: failure: zookeeper/zookeeper-3.4.5.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   falcon-0.5.0.2.1.7.0-784.el6.noarch: failure: falcon/falcon-0.5.0.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
stdout:   /var/lib/ambari-agent/data/output-302.txt
2014-11-05 17:12:13,977 - Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf -x "" --retry 10     http://ambari.bh.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']} 2014-11-05 17:12:13,987 - Skipping Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf -x "" --retry 10     http://ambari.bh.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if 2014-11-05 17:12:13,998 - Repository['HDP-2.1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP', 'mirror_list': None} 2014-11-05 17:12:14,003 - File['/etc/yum.repos.d/HDP.repo'] {'content': Template('repo_suse_rhel.j2')} 2014-11-05 17:12:14,004 - Repository['HDP-UTILS-1.1.0.17'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.17/repos/centos6', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None} 2014-11-05 17:12:14,006 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': Template('repo_suse_rhel.j2')} 2014-11-05 17:12:14,007 - Package['unzip'] {} 2014-11-05 17:12:14,026 - Skipping installing existent package unzip 2014-11-05 17:12:14,026 - Package['curl'] {} 2014-11-05 17:12:14,048 - Skipping installing existent package curl 2014-11-05 17:12:14,048 - Execute['mkdir -p /tmp/HDP-artifacts/ ;   curl -kf -x ""   --retry 10 http://ambari.bh.com:8080/resources//jdk-7u45-linux-x64.tar.gz -o /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz'] {'environment': ..., 'not_if': 'test -e /usr/jdk64/jdk1.7.0_45/bin/java', 'path': ['/bin', '/usr/bin/']} 2014-11-05 17:12:14,057 - Skipping Execute['mkdir -p /tmp/HDP-artifacts/ ;   curl -kf -x 
""   --retry 10 http://ambari.bh.com:8080/resources//jdk-7u45-linux-x64.tar.gz -o /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz'] due to not_if 2014-11-05 17:12:14,057 - Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz > /dev/null 2>&1'] {'not_if': 'test -e /usr/jdk64/jdk1.7.0_45/bin/java', 'path': ['/bin', '/usr/bin/']} 2014-11-05 17:12:14,066 - Skipping Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz > /dev/null 2>&1'] due to not_if 2014-11-05 17:12:14,066 - Group['hadoop'] {'ignore_failures': False} 2014-11-05 17:12:14,067 - Modifying group hadoop 2014-11-05 17:12:14,090 - Group['users'] {'ignore_failures': False} 2014-11-05 17:12:14,091 - Modifying group users 2014-11-05 17:12:14,116 - Group['users'] {'ignore_failures': False} 2014-11-05 17:12:14,117 - Modifying group users 2014-11-05 17:12:14,141 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']} 2014-11-05 17:12:14,141 - Modifying user ambari-qa 2014-11-05 17:12:14,158 - File['/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555} 2014-11-05 17:12:14,159 - Execute['/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'} 2014-11-05 17:12:14,175 - Skipping Execute['/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if 2014-11-05 17:12:14,177 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']} 2014-11-05 17:12:14,177 - Modifying user hbase 2014-11-05 17:12:14,188 - File['/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555} 2014-11-05 17:12:14,189 - Execute['/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 
2>/dev/null'] {'not_if': 'test $(id -u hbase) -gt 1000'}
2014-11-05 17:12:14,200 - Skipping Execute['/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] due to not_if
2014-11-05 17:12:14,200 - Group['nagios'] {'ignore_failures': False}
2014-11-05 17:12:14,200 - Modifying group nagios
2014-11-05 17:12:14,221 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False}
2014-11-05 17:12:14,221 - Modifying user nagios
2014-11-05 17:12:14,232 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False}
2014-11-05 17:12:14,232 - Modifying user oozie
2014-11-05 17:12:14,242 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False}
2014-11-05 17:12:14,243 - Modifying user hcat
2014-11-05 17:12:14,252 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False}
2014-11-05 17:12:14,253 - Modifying user hcat
2014-11-05 17:12:14,262 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False}
2014-11-05 17:12:14,262 - Modifying user hive
2014-11-05 17:12:14,272 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False}
2014-11-05 17:12:14,272 - Modifying user yarn
2014-11-05 17:12:14,282 - Group['nobody'] {'ignore_failures': False}
2014-11-05 17:12:14,282 - Modifying group nobody
2014-11-05 17:12:14,305 - Group['nobody'] {'ignore_failures': False}
2014-11-05 17:12:14,305 - Modifying group nobody
2014-11-05 17:12:14,326 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
2014-11-05 17:12:14,326 - Modifying user nobody
2014-11-05 17:12:14,337 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
2014-11-05 17:12:14,337 - Modifying user nobody
2014-11-05 17:12:14,350 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-11-05 17:12:14,350 - Modifying user hdfs
2014-11-05 17:12:14,366 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-11-05 17:12:14,368 - Modifying user mapred
2014-11-05 17:12:14,387 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False}
2014-11-05 17:12:14,388 - Modifying user zookeeper
2014-11-05 17:12:14,405 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-11-05 17:12:14,405 - Modifying user storm
2014-11-05 17:12:14,425 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-11-05 17:12:14,426 - Modifying user falcon
2014-11-05 17:12:14,446 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2014-11-05 17:12:14,446 - Modifying user tez
2014-11-05 17:12:14,576 - Package['falcon'] {}
2014-11-05 17:12:14,610 - Installing package falcon ('/usr/bin/yum -d 0 -e 0 -y install falcon')
2014-11-05 17:12:21,987 - Error while executing command 'install':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 111, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.1/services/FALCON/package/scripts/falcon_client.py", line 25, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 167, in install_packages
    Package(name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 40, in action_install
    self.install_package(package_name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 36, in install_package
    shell.checked_call(cmd)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 35, in checked_call
    return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 90, in _call
    raise Fail(err_msg)
Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install falcon' returned 1. Error Downloading Packages:
  hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  bigtop-jsvc-1.0.10-1.el6.x86_64: failure: bigtop-jsvc/bigtop-jsvc-1.0.10-1.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  hadoop-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  zookeeper-3.4.5.2.1.7.0-784.el6.noarch: failure: zookeeper/zookeeper-3.4.5.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  falcon-0.5.0.2.1.7.0-784.el6.noarch: failure: falcon/falcon-0.5.0.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.




------------------ Original ------------------
From:  "guxiaobo1982";<gu...@qq.com>;
Send time: Wednesday, Nov 5, 2014 3:37 PM
To: "user"<us...@ambari.apache.org>; 

Subject:  Re:  how to install a specific version of HDP using Ambari



Is there a mapping between HDP and HDP-UTILS minor versions? If I choose to install HDP 2.1.5, which version of HDP-UTILS should I use?




------------------ Original ------------------
From:  "Jeff Sposetti";<je...@hortonworks.com>;
Send time: Wednesday, Nov 5, 2014 9:10 AM
To: "user@ambari.apache.org"<us...@ambari.apache.org>; 

Subject:  Re: how to install a specific version of HDP using Ambari



You are correct that Ambari will grab the latest HDP 2.1.x maintenance release repos if you are connected to the internet (for it to check for the latest) and you select stack HDP 2.1.

But if you want to install an older version of HDP 2.1.x, do the following:

1) During install, on the Select Stack page, select HDP 2.1
2) Expand the Advanced Repository Options section

3) Enter the Base URL for the HDP 2.1.x version you wish to install (overwriting the 2.1.7.0 repo entries that show up by default)



Looking at the docs here:


http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.3/index.html



The Base URL for HDP is http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0 and HDP-UTILS is http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.17/repos/centos6
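Before entering those Base URLs in the wizard, it can help to confirm they actually serve yum metadata. A minimal sketch (the `repodata/repomd.xml` path is standard yum repo layout; the `curl` check is a suggested sanity test, not part of Ambari):

```shell
# Build the metadata URLs for the HDP 2.1.3.0 and HDP-UTILS base repos.
HDP_BASE=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0
UTILS_BASE=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.17/repos/centos6

# Every yum repo publishes its metadata index at repodata/repomd.xml.
HDP_CHECK="$HDP_BASE/repodata/repomd.xml"
UTILS_CHECK="$UTILS_BASE/repodata/repomd.xml"
echo "$HDP_CHECK"
echo "$UTILS_CHECK"

# On a host with internet access, verify both URLs resolve before using them:
#   curl -sfI "$HDP_CHECK"   >/dev/null && echo "HDP base URL OK"
#   curl -sfI "$UTILS_CHECK" >/dev/null && echo "HDP-UTILS base URL OK"
```

If either check fails, the base URL is wrong for your OS/version combination and Ambari will fall back to "[Errno 256] No more mirrors to try" errors during install.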



On Tue, Nov 4, 2014 at 8:04 PM, guxiaobo1982 <gu...@qq.com> wrote:
Hi,


The current GUI lets users choose a major version of HDP to install, such as 2.1 or 2.0, and will install the latest minor version, such as 2.1.7. But how can I choose to install a specific minor version, such as 2.1.3, since I found that 2.1.7 may have some bugs with Hive?


Regards,


Xiaobo gu




 
 CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to which it is addressed and may contain information that is confidential, privileged and exempt from disclosure under applicable law. If the reader of this message is not the intended recipient, you are hereby notified that any printing, copying, dissemination, distribution, disclosure or forwarding of this communication is strictly prohibited. If you have received this communication in error, please contact the sender immediately and delete it from your system. Thank You.




 

Re: how to install a specific version of HDP using Ambari

Posted by Jeff Sposetti <je...@hortonworks.com>.
It looks like it's still trying to grab the 2.1.7 RPMs.

1) Can you confirm what is in /etc/yum.repos.d/HDP.repo? This file is
generated by Ambari based on the Base URLs you enter and should reflect the
2.1.3 urls that you entered during the wizard.
2) "yum clean all"
3) "yum info hadoop" and see what version it returns.
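The checks above can be scripted. Here is a self-contained sketch of what a correctly regenerated HDP.repo should contain, with a grep to confirm the baseurl was switched; the exact field layout is an assumption based on the repo_suse_rhel.j2 template named in the agent logs, and on a real host the file lives at /etc/yum.repos.d/HDP.repo:

```shell
# Write a sample of the repo file Ambari should generate into a temp dir,
# then grep the baseurl to confirm it points at 2.1.3.0, not 2.1.7.0.
tmp=$(mktemp -d)
cat > "$tmp/HDP.repo" <<'EOF'
[HDP-2.1]
name=HDP
baseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0
path=/
enabled=1
gpgcheck=0
EOF

# On a real host, point this at /etc/yum.repos.d/HDP.repo instead. If it
# still shows 2.1.7.0, re-enter the Base URLs in the wizard, then run
# `yum clean all` and `yum info hadoop` to confirm the resolved version.
grep '^baseurl=' "$tmp/HDP.repo"
```

Flushing the metadata cache with `yum clean all` matters here: even with a corrected repo file, yum can keep resolving the old 2.1.7.0 packages from cached repodata.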


On Wed, Nov 5, 2014 at 4:16 AM, guxiaobo1982 <gu...@qq.com> wrote:

> I tried
> http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0 and
> http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.5.0 for
> CENTOS6 both,
>
> can with error like this
>
>
> stderr:   /var/lib/ambari-agent/data/errors-302.txt
>
> 2014-11-05 17:12:21,987 - Error while executing command 'install':
> Traceback (most recent call last):
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 111, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.1/services/FALCON/package/scripts/falcon_client.py", line 25, in install
>     self.install_packages(env)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 167, in install_packages
>     Package(name)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 40, in action_install
>     self.install_package(package_name)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 36, in install_package
>     shell.checked_call(cmd)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 35, in checked_call
>     return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 90, in _call
>     raise Fail(err_msg)
> Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install falcon' returned 1. Error Downloading Packages:
>   hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   bigtop-jsvc-1.0.10-1.el6.x86_64: failure: bigtop-jsvc/bigtop-jsvc-1.0.10-1.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   hadoop-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   zookeeper-3.4.5.2.1.7.0-784.el6.noarch: failure: zookeeper/zookeeper-3.4.5.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   falcon-0.5.0.2.1.7.0-784.el6.noarch: failure: falcon/falcon-0.5.0.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>
> stdout:   /var/lib/ambari-agent/data/output-302.txt
>
> 2014-11-05 17:12:13,977 - Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf -x "" --retry 10     http://ambari.bh.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
> 2014-11-05 17:12:13,987 - Skipping Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf -x "" --retry 10     http://ambari.bh.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
> 2014-11-05 17:12:13,998 - Repository['HDP-2.1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP', 'mirror_list': None}
> 2014-11-05 17:12:14,003 - File['/etc/yum.repos.d/HDP.repo'] {'content': Template('repo_suse_rhel.j2')}
> 2014-11-05 17:12:14,004 - Repository['HDP-UTILS-1.1.0.17'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.17/repos/centos6', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
> 2014-11-05 17:12:14,006 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': Template('repo_suse_rhel.j2')}
> 2014-11-05 17:12:14,007 - Package['unzip'] {}
> 2014-11-05 17:12:14,026 - Skipping installing existent package unzip
> 2014-11-05 17:12:14,026 - Package['curl'] {}
> 2014-11-05 17:12:14,048 - Skipping installing existent package curl
> 2014-11-05 17:12:14,048 - Execute['mkdir -p /tmp/HDP-artifacts/ ;   curl -kf -x ""   --retry 10 http://ambari.bh.com:8080/resources//jdk-7u45-linux-x64.tar.gz -o /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz'] {'environment': ..., 'not_if': 'test -e /usr/jdk64/jdk1.7.0_45/bin/java', 'path': ['/bin', '/usr/bin/']}
> 2014-11-05 17:12:14,057 - Skipping Execute['mkdir -p /tmp/HDP-artifacts/ ;   curl -kf -x ""   --retry 10 http://ambari.bh.com:8080/resources//jdk-7u45-linux-x64.tar.gz -o /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz'] due to not_if
> 2014-11-05 17:12:14,057 - Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz > /dev/null 2>&1'] {'not_if': 'test -e /usr/jdk64/jdk1.7.0_45/bin/java', 'path': ['/bin', '/usr/bin/']}
> 2014-11-05 17:12:14,066 - Skipping Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz > /dev/null 2>&1'] due to not_if
> 2014-11-05 17:12:14,066 - Group['hadoop'] {'ignore_failures': False}
> 2014-11-05 17:12:14,067 - Modifying group hadoop
> 2014-11-05 17:12:14,090 - Group['users'] {'ignore_failures': False}
> 2014-11-05 17:12:14,091 - Modifying group users
> 2014-11-05 17:12:14,116 - Group['users'] {'ignore_failures': False}
> 2014-11-05 17:12:14,117 - Modifying group users
> 2014-11-05 17:12:14,141 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
> 2014-11-05 17:12:14,141 - Modifying user ambari-qa
> 2014-11-05 17:12:14,158 - File['/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2014-11-05 17:12:14,159 - Execute['/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
> 2014-11-05 17:12:14,175 - Skipping Execute['/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if
> 2014-11-05 17:12:14,177 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2014-11-05 17:12:14,177 - Modifying user hbase
> 2014-11-05 17:12:14,188 - File['/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2014-11-05 17:12:14,189 - Execute['/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] {'not_if': 'test $(id -u hbase) -gt 1000'}
> 2014-11-05 17:12:14,200 - Skipping Execute['/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] due to not_if
> 2014-11-05 17:12:14,200 - Group['nagios'] {'ignore_failures': False}
> 2014-11-05 17:12:14,200 - Modifying group nagios
> 2014-11-05 17:12:14,221 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False}
> 2014-11-05 17:12:14,221 - Modifying user nagios
> 2014-11-05 17:12:14,232 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False}
> 2014-11-05 17:12:14,232 - Modifying user oozie
> 2014-11-05 17:12:14,242 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False}
> 2014-11-05 17:12:14,243 - Modifying user hcat
> 2014-11-05 17:12:14,252 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False}
> 2014-11-05 17:12:14,253 - Modifying user hcat
> 2014-11-05 17:12:14,262 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False}
> 2014-11-05 17:12:14,262 - Modifying user hive
> 2014-11-05 17:12:14,272 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False}
> 2014-11-05 17:12:14,272 - Modifying user yarn
> 2014-11-05 17:12:14,282 - Group['nobody'] {'ignore_failures': False}
> 2014-11-05 17:12:14,282 - Modifying group nobody
> 2014-11-05 17:12:14,305 - Group['nobody'] {'ignore_failures': False}
> 2014-11-05 17:12:14,305 - Modifying group nobody
> 2014-11-05 17:12:14,326 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
> 2014-11-05 17:12:14,326 - Modifying user nobody
> 2014-11-05 17:12:14,337 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
> 2014-11-05 17:12:14,337 - Modifying user nobody
> 2014-11-05 17:12:14,350 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2014-11-05 17:12:14,350 - Modifying user hdfs
> 2014-11-05 17:12:14,366 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2014-11-05 17:12:14,368 - Modifying user mapred
> 2014-11-05 17:12:14,387 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False}
> 2014-11-05 17:12:14,388 - Modifying user zookeeper
> 2014-11-05 17:12:14,405 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2014-11-05 17:12:14,405 - Modifying user storm
> 2014-11-05 17:12:14,425 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2014-11-05 17:12:14,426 - Modifying user falcon
> 2014-11-05 17:12:14,446 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
> 2014-11-05 17:12:14,446 - Modifying user tez
> 2014-11-05 17:12:14,576 - Package['falcon'] {}
> 2014-11-05 17:12:14,610 - Installing package falcon ('/usr/bin/yum -d 0 -e 0 -y install falcon')
> 2014-11-05 17:12:21,987 - Error while executing command 'install':
> Traceback (most recent call last):
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 111, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.1/services/FALCON/package/scripts/falcon_client.py", line 25, in install
>     self.install_packages(env)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 167, in install_packages
>     Package(name)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 40, in action_install
>     self.install_package(package_name)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 36, in install_package
>     shell.checked_call(cmd)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 35, in checked_call
>     return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 90, in _call
>     raise Fail(err_msg)
> Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install falcon' returned 1. Error Downloading Packages:
>   hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   bigtop-jsvc-1.0.10-1.el6.x86_64: failure: bigtop-jsvc/bigtop-jsvc-1.0.10-1.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   hadoop-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   zookeeper-3.4.5.2.1.7.0-784.el6.noarch: failure: zookeeper/zookeeper-3.4.5.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>   falcon-0.5.0.2.1.7.0-784.el6.noarch: failure: falcon/falcon-0.5.0.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
>
>
>
>
> ------------------ Original ------------------
> *From: * "guxiaobo1982";<gu...@qq.com>;
> *Send time:* Wednesday, Nov 5, 2014 3:37 PM
> *To:* "user"<us...@ambari.apache.org>;
> *Subject: * Re: how to install a specific version of HDP using Ambari
>
> Is there a mapping between HDP and HDP-UTILS minor versions? If I choose to
> install HDP 2.1.5, which version of HDP-UTILS should I use?
>
>
> ------------------ Original ------------------
> *From: * "Jeff Sposetti";<je...@hortonworks.com>;
> *Send time:* Wednesday, Nov 5, 2014 9:10 AM
> *To:* "user@ambari.apache.org"<us...@ambari.apache.org>;
> *Subject: * Re: how to install a specific version of HDP using Ambari
>
> You are correct that Ambari will grab the latest HDP 2.1.x maintenance
> release repos if you are connected to the internet (for it to check for the
> latest) and you select stack HDP 2.1.
>
> But if you want to install an older version of HDP 2.1.x, do the following:
>
> 1) During install, on the Select Stack page, select HDP 2.1
> 2) Expand the Advanced Repository Options section
> 3) Enter the Base URL for the HDP 2.1.x version you wish to install
> (overwriting the 2.1.7.0 repo entries that show up by default)
>
> Looking at the docs here:
>
> http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.3/index.html
>
> The Base URL for HDP is
> http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0 and
> HDP-UTILS is
> http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.17/repos/centos6
>
>
>
> On Tue, Nov 4, 2014 at 8:04 PM, guxiaobo1982 <gu...@qq.com> wrote:
>
>> Hi,
>>
>> The current GUI lets users choose a major version of HDP to install, such
>> as 2.1 or 2.0, and will install the latest minor version, such as 2.1.7.
>> But how can I choose to install a specific minor version, such as 2.1.3,
>> since I found that 2.1.7 may have some bugs with Hive?
>>
>> Regards,
>>
>> Xiaobo gu
>>
>
>
>


Re: how to install a specific version of HDP using Ambari

Posted by guxiaobo1982 <gu...@qq.com>.
I tried http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0 and http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.5.0 for CENTOS6 both,


came back with an error like this




stderr:   /var/lib/ambari-agent/data/errors-302.txt
2014-11-05 17:12:21,987 - Error while executing command 'install': Traceback (most recent call last):   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 111, in execute     method(env)   File "/var/lib/ambari-agent/cache/stacks/HDP/2.1/services/FALCON/package/scripts/falcon_client.py", line 25, in install     self.install_packages(env)   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 167, in install_packages     Package(name)   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__     self.env.run()   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run     self.run_action(resource, action)   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action     provider_action()   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 40, in action_install     self.install_package(package_name)   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 36, in install_package     shell.checked_call(cmd)   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 35, in checked_call     return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout)   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 90, in _call     raise Fail(err_msg) Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install falcon' returned 1. Error Downloading Packages:   hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   bigtop-jsvc-1.0.10-1.el6.x86_64: failure: bigtop-jsvc/bigtop-jsvc-1.0.10-1.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   
hadoop-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   zookeeper-3.4.5.2.1.7.0-784.el6.noarch: failure: zookeeper/zookeeper-3.4.5.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.   falcon-0.5.0.2.1.7.0-784.el6.noarch: failure: falcon/falcon-0.5.0.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
stdout:   /var/lib/ambari-agent/data/output-302.txt
2014-11-05 17:12:13,977 - Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf -x "" --retry 10     http://ambari.bh.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']} 2014-11-05 17:12:13,987 - Skipping Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf -x "" --retry 10     http://ambari.bh.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if 2014-11-05 17:12:13,998 - Repository['HDP-2.1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP', 'mirror_list': None} 2014-11-05 17:12:14,003 - File['/etc/yum.repos.d/HDP.repo'] {'content': Template('repo_suse_rhel.j2')} 2014-11-05 17:12:14,004 - Repository['HDP-UTILS-1.1.0.17'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.17/repos/centos6', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None} 2014-11-05 17:12:14,006 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': Template('repo_suse_rhel.j2')} 2014-11-05 17:12:14,007 - Package['unzip'] {} 2014-11-05 17:12:14,026 - Skipping installing existent package unzip 2014-11-05 17:12:14,026 - Package['curl'] {} 2014-11-05 17:12:14,048 - Skipping installing existent package curl 2014-11-05 17:12:14,048 - Execute['mkdir -p /tmp/HDP-artifacts/ ;   curl -kf -x ""   --retry 10 http://ambari.bh.com:8080/resources//jdk-7u45-linux-x64.tar.gz -o /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz'] {'environment': ..., 'not_if': 'test -e /usr/jdk64/jdk1.7.0_45/bin/java', 'path': ['/bin', '/usr/bin/']} 2014-11-05 17:12:14,057 - Skipping Execute['mkdir -p /tmp/HDP-artifacts/ ;   curl -kf -x 
""   --retry 10 http://ambari.bh.com:8080/resources//jdk-7u45-linux-x64.tar.gz -o /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz'] due to not_if 2014-11-05 17:12:14,057 - Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz > /dev/null 2>&1'] {'not_if': 'test -e /usr/jdk64/jdk1.7.0_45/bin/java', 'path': ['/bin', '/usr/bin/']} 2014-11-05 17:12:14,066 - Skipping Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /tmp/HDP-artifacts//jdk-7u45-linux-x64.tar.gz > /dev/null 2>&1'] due to not_if 2014-11-05 17:12:14,066 - Group['hadoop'] {'ignore_failures': False} 2014-11-05 17:12:14,067 - Modifying group hadoop 2014-11-05 17:12:14,090 - Group['users'] {'ignore_failures': False} 2014-11-05 17:12:14,091 - Modifying group users 2014-11-05 17:12:14,116 - Group['users'] {'ignore_failures': False} 2014-11-05 17:12:14,117 - Modifying group users 2014-11-05 17:12:14,141 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']} 2014-11-05 17:12:14,141 - Modifying user ambari-qa 2014-11-05 17:12:14,158 - File['/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555} 2014-11-05 17:12:14,159 - Execute['/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'} 2014-11-05 17:12:14,175 - Skipping Execute['/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if 2014-11-05 17:12:14,177 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']} 2014-11-05 17:12:14,177 - Modifying user hbase 2014-11-05 17:12:14,188 - File['/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555} 2014-11-05 17:12:14,189 - Execute['/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 
2>/dev/null'] {'not_if': 'test $(id -u hbase) -gt 1000'}
2014-11-05 17:12:14,200 - Skipping Execute['/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] due to not_if
2014-11-05 17:12:14,200 - Group['nagios'] {'ignore_failures': False}
2014-11-05 17:12:14,200 - Modifying group nagios
2014-11-05 17:12:14,221 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False}
2014-11-05 17:12:14,221 - Modifying user nagios
2014-11-05 17:12:14,232 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False}
2014-11-05 17:12:14,232 - Modifying user oozie
2014-11-05 17:12:14,242 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False}
2014-11-05 17:12:14,243 - Modifying user hcat
2014-11-05 17:12:14,252 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False}
2014-11-05 17:12:14,253 - Modifying user hcat
2014-11-05 17:12:14,262 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False}
2014-11-05 17:12:14,262 - Modifying user hive
2014-11-05 17:12:14,272 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False}
2014-11-05 17:12:14,272 - Modifying user yarn
2014-11-05 17:12:14,282 - Group['nobody'] {'ignore_failures': False}
2014-11-05 17:12:14,282 - Modifying group nobody
2014-11-05 17:12:14,305 - Group['nobody'] {'ignore_failures': False}
2014-11-05 17:12:14,305 - Modifying group nobody
2014-11-05 17:12:14,326 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
2014-11-05 17:12:14,326 - Modifying user nobody
2014-11-05 17:12:14,337 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
2014-11-05 17:12:14,337 - Modifying user nobody
2014-11-05 17:12:14,350 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-11-05 17:12:14,350 - Modifying user hdfs
2014-11-05 17:12:14,366 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-11-05 17:12:14,368 - Modifying user mapred
2014-11-05 17:12:14,387 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False}
2014-11-05 17:12:14,388 - Modifying user zookeeper
2014-11-05 17:12:14,405 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-11-05 17:12:14,405 - Modifying user storm
2014-11-05 17:12:14,425 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-11-05 17:12:14,426 - Modifying user falcon
2014-11-05 17:12:14,446 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2014-11-05 17:12:14,446 - Modifying user tez
2014-11-05 17:12:14,576 - Package['falcon'] {}
2014-11-05 17:12:14,610 - Installing package falcon ('/usr/bin/yum -d 0 -e 0 -y install falcon')
2014-11-05 17:12:21,987 - Error while executing command 'install': Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 111, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.1/services/FALCON/package/scripts/falcon_client.py", line 25, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 167, in install_packages
    Package(name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 40, in action_install
    self.install_package(package_name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 36, in install_package
    shell.checked_call(cmd)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 35, in checked_call
    return _call(command, logoutput, True, cwd, env, preexec_fn, user, wait_for_finish, timeout)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 90, in _call
    raise Fail(err_msg)
Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install falcon' returned 1. Error Downloading Packages:
  hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-yarn-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  bigtop-jsvc-1.0.10-1.el6.x86_64: failure: bigtop-jsvc/bigtop-jsvc-1.0.10-1.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  hadoop-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-client-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-hdfs-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64: failure: hadoop/hadoop-mapreduce-2.4.0.2.1.7.0-784.el6.x86_64.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  zookeeper-3.4.5.2.1.7.0-784.el6.noarch: failure: zookeeper/zookeeper-3.4.5.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
  falcon-0.5.0.2.1.7.0-784.el6.noarch: failure: falcon/falcon-0.5.0.2.1.7.0-784.el6.noarch.rpm from HDP-2.1: [Errno 256] No more mirrors to try.
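Note that every failing package in the log is a 2.1.7.0-784 build, which suggests the HDP-2.1 repo definition on the host still points at the 2.1.7 base URL rather than 2.1.3. A minimal sketch of the check Jeff suggested (the repo contents below are a hypothetical sample written to /tmp for illustration; on a real host the file Ambari generates is /etc/yum.repos.d/HDP.repo):

```shell
# Hypothetical sample of an Ambari-generated HDP.repo, pinned to 2.1.3.0.
# On a real host, inspect /etc/yum.repos.d/HDP.repo instead of creating this.
cat > /tmp/HDP.repo <<'EOF'
[HDP-2.1]
name=HDP-2.1
baseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0
enabled=1
gpgcheck=0
EOF

# Extract the baseurl and check which minor version it pins.
baseurl=$(grep '^baseurl=' /tmp/HDP.repo | cut -d= -f2-)
echo "$baseurl"
case "$baseurl" in
  *2.1.3.0*) echo "repo pinned to 2.1.3" ;;
  *)         echo "repo NOT pinned to 2.1.3" ;;
esac
```

If the baseurl on the host still shows 2.1.7.0, correct the Base URLs on the Select Stack page (or fix the repo file), run `yum clean all` to drop cached metadata, and confirm with `yum info hadoop` that the intended build resolves before retrying the install.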




------------------ Original ------------------
From:  "guxiaobo1982";<gu...@qq.com>;
Send time: Wednesday, Nov 5, 2014 3:37 PM
To: "user"<us...@ambari.apache.org>; 

Subject:  Re:  how to install a specific version of HDP using Ambari



Is there a mapping between HDP and HDP-UTILS minor versions? If I choose to install HDP 2.1.5, which version of HDP-UTILS should I use?




------------------ Original ------------------
From:  "Jeff Sposetti";<je...@hortonworks.com>;
Send time: Wednesday, Nov 5, 2014 9:10 AM
To: "user@ambari.apache.org"<us...@ambari.apache.org>; 

Subject:  Re: how to install a specific version of HDP using Ambari



You are correct that Ambari will grab the latest HDP 2.1.x maintenance release repos if you are connected to the internet (for it to check for the latest) and you select stack HDP 2.1.

But if you want to install an older version of HDP 2.1.x, do the following:

1) During install, on the Select Stack page, select HDP 2.1
2) Expand the Advanced Repository Options section

3) Enter the Base URL for the HDP 2.1.x version you wish to install (overwriting the 2.1.7.0 repo entries that show up by default)



Looking at the docs here:


http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.3/index.html



The Base URL for HDP is http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0 and HDP-UTILS is http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.17/repos/centos6
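Given those two base URLs, the HDP.repo file that Ambari generates on each host should come out roughly like this (a sketch: the section names, name= values, and gpgcheck/enabled settings are illustrative defaults, and only the baseurl values come from the docs above):

```ini
[HDP-2.1]
name=HDP-2.1
baseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0
enabled=1
gpgcheck=0

[HDP-UTILS-1.1.0.17]
name=HDP-UTILS-1.1.0.17
baseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.17/repos/centos6
enabled=1
gpgcheck=0
```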



On Tue, Nov 4, 2014 at 8:04 PM, guxiaobo1982 <gu...@qq.com> wrote:
Hi,


The current GUI lets users choose a major version of HDP to install, such as 2.1 or 2.0, and installs the latest minor version, such as 2.1.7. How can I choose to install a specific minor version such as 2.1.3? I found that 2.1.7 may have some bugs related to Hive.


Regards,


Xiaobo gu




 
 CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to which it is addressed and may contain information that is confidential, privileged and exempt from disclosure under applicable law. If the reader of this message is not the intended recipient, you are hereby notified that any printing, copying, dissemination, distribution, disclosure or forwarding of this communication is strictly prohibited. If you have received this communication in error, please contact the sender immediately and delete it from your system. Thank You.

Re: how to install a specific version of HDP using Ambari

Posted by guxiaobo1982 <gu...@qq.com>.
Is there a mapping between HDP and HDP-UTILS minor versions? If I choose to install HDP 2.1.5, which version of HDP-UTILS should I use?




------------------ Original ------------------
From:  "Jeff Sposetti";<je...@hortonworks.com>;
Send time: Wednesday, Nov 5, 2014 9:10 AM
To: "user@ambari.apache.org"<us...@ambari.apache.org>; 

Subject:  Re: how to install a specific version of HDP using Ambari



You are correct that Ambari will grab the latest HDP 2.1.x maintenance release repos if you are connected to the internet (for it to check for the latest) and you select stack HDP 2.1.

But if you want to install an older version of HDP 2.1.x, do the following:

1) During install, on the Select Stack page, select HDP 2.1
2) Expand the Advanced Repository Options section

3) Enter the Base URL for the HDP 2.1.x version you wish to install (overwriting the 2.1.7.0 repo entries that show up by default)



Looking at the docs here:


http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.3/index.html



The Base URL for HDP is http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0 and HDP-UTILS is http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.17/repos/centos6



On Tue, Nov 4, 2014 at 8:04 PM, guxiaobo1982 <gu...@qq.com> wrote:
Hi,


The current GUI lets users choose a major version of HDP to install, such as 2.1 or 2.0, and installs the latest minor version, such as 2.1.7. How can I choose to install a specific minor version such as 2.1.3? I found that 2.1.7 may have some bugs related to Hive.


Regards,


Xiaobo gu




 

Re: how to install a specific version of HDP using Ambari

Posted by Jeff Sposetti <je...@hortonworks.com>.
You are correct that Ambari will grab the latest HDP 2.1.x maintenance
release repos if you are connected to the internet (for it to check for the
latest) and you select stack HDP 2.1.

But if you want to install an older version of HDP 2.1.x, do the following:

1) During install, on the Select Stack page, select HDP 2.1
2) Expand the Advanced Repository Options section
3) Enter the Base URL for the HDP 2.1.x version you wish to install
(overwriting the 2.1.7.0 repo entries that show up by default)

Looking at the docs here:

http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.3/index.html

The Base URL for HDP is
http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.1.3.0 and
HDP-UTILS is
http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.17/repos/centos6



On Tue, Nov 4, 2014 at 8:04 PM, guxiaobo1982 <gu...@qq.com> wrote:

> Hi,
>
> The current GUI lets users choose a major version of HDP to install, such
> as 2.1 or 2.0, and installs the latest minor version, such as 2.1.7. How
> can I choose to install a specific minor version such as 2.1.3? I found
> that 2.1.7 may have some bugs related to Hive.
>
> Regards,
>
> Xiaobo gu
>
