Posted to dev@ambari.apache.org by Andrew Onischuk <ao...@hortonworks.com> on 2014/10/31 19:43:30 UTC
Review Request 27437: Install on a 5 node cluster fails with link creation for libsnappy.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/27437/
-----------------------------------------------------------
Review request for Ambari and Myroslav Papirkovskyy.
Bugs: AMBARI-8076
https://issues.apache.org/jira/browse/AMBARI-8076
Repository: ambari
Description
-------
Install on a 5 node cluster fails with link creation for libsnappy.
2014-10-29 22:38:09,249 - Error while executing command 'start':
Traceback (most recent call last):
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 122, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/hook.py", line 32, in hook
setup_hadoop()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py", line 34, in setup_hadoop
install_snappy()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py", line 168, in install_snappy
format("mkdir -p {so_target_dir_x86}; ln -sf {so_src_x86} {so_target_x86}"))
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 241, in action_run
raise ex
Fail: Execution of 'mkdir -p /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32; ln -sf /usr/hdp/current/hadoop-client/lib/libsnappy.so /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so' returned 1. mkdir: cannot create directory `/usr/hdp/current/hadoop-client': File exists
ln: creating symbolic link `/usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so': No such file or directory
stdout: /var/lib/ambari-agent/data/output-78.txt
2014-10-29 22:38:08,708 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/; curl -kf -x "" --retry 10 http://pt170-1.c.pramod-thangali.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
2014-10-29 22:38:08,725 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/; curl -kf -x "" --retry 10 http://pt170-1.c.pramod-thangali.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
2014-10-29 22:38:08,726 - Group['hadoop'] {'ignore_failures': False}
2014-10-29 22:38:08,727 - Modifying group hadoop
2014-10-29 22:38:08,788 - Group['nobody'] {'ignore_failures': False}
2014-10-29 22:38:08,788 - Modifying group nobody
2014-10-29 22:38:08,824 - Group['users'] {'ignore_failures': False}
2014-10-29 22:38:08,825 - Modifying group users
2014-10-29 22:38:08,858 - Group['nagios'] {'ignore_failures': False}
2014-10-29 22:38:08,858 - Modifying group nagios
2014-10-29 22:38:08,891 - Group['knox'] {'ignore_failures': False}
2014-10-29 22:38:08,892 - Modifying group knox
2014-10-29 22:38:08,916 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
2014-10-29 22:38:08,916 - Modifying user nobody
2014-10-29 22:38:08,929 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:08,930 - Modifying user hive
2014-10-29 22:38:08,942 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2014-10-29 22:38:08,942 - Modifying user oozie
2014-10-29 22:38:08,955 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:08,955 - Modifying user nagios
2014-10-29 22:38:08,968 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2014-10-29 22:38:08,968 - Modifying user ambari-qa
2014-10-29 22:38:08,981 - User['flume'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:08,981 - Modifying user flume
2014-10-29 22:38:08,993 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:08,994 - Modifying user hdfs
2014-10-29 22:38:09,006 - User['knox'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,006 - Modifying user knox
2014-10-29 22:38:09,019 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,019 - Modifying user storm
2014-10-29 22:38:09,031 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,032 - Modifying user mapred
2014-10-29 22:38:09,044 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,044 - Modifying user hbase
2014-10-29 22:38:09,057 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2014-10-29 22:38:09,057 - Modifying user tez
2014-10-29 22:38:09,070 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,070 - Modifying user zookeeper
2014-10-29 22:38:09,082 - User['kafka'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,083 - Modifying user kafka
2014-10-29 22:38:09,095 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,095 - Modifying user falcon
2014-10-29 22:38:09,108 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,108 - Modifying user sqoop
2014-10-29 22:38:09,121 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,122 - Modifying user yarn
2014-10-29 22:38:09,134 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,135 - Modifying user hcat
2014-10-29 22:38:09,147 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2014-10-29 22:38:09,149 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
2014-10-29 22:38:09,160 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if
2014-10-29 22:38:09,161 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2014-10-29 22:38:09,162 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] {'not_if': 'test $(id -u hbase) -gt 1000'}
2014-10-29 22:38:09,173 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] due to not_if
2014-10-29 22:38:09,174 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
2014-10-29 22:38:09,174 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2014-10-29 22:38:09,185 - Skipping Link['/etc/hadoop/conf'] due to not_if
2014-10-29 22:38:09,197 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs'}
2014-10-29 22:38:09,207 - Execute['/bin/echo 0 > /selinux/enforce'] {'only_if': 'test -f /selinux/enforce'}
2014-10-29 22:38:09,235 - Execute['mkdir -p /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32; ln -sf /usr/hdp/current/hadoop-client/lib/libsnappy.so /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so'] {}
2014-10-29 22:38:09,249 - Error while executing command 'start':
Traceback (most recent call last):
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 122, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/hook.py", line 32, in hook
setup_hadoop()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py", line 34, in setup_hadoop
install_snappy()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py", line 168, in install_snappy
format("mkdir -p {so_target_dir_x86}; ln -sf {so_src_x86} {so_target_x86}"))
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 241, in action_run
raise ex
Fail: Execution of 'mkdir -p /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32; ln -sf /usr/hdp/current/hadoop-client/lib/libsnappy.so /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so' returned 1. mkdir: cannot create directory `/usr/hdp/current/hadoop-client': File exists
ln: creating symbolic link `/usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so': No such file or directory
Looks like this happens if there is no Hadoop-related library installed on the host.
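The "File exists" from `mkdir -p` is consistent with a path component that is a symlink to a missing target, which is what `/usr/hdp/current/hadoop-client` would be on a host without the Hadoop packages. A minimal sketch reproducing that failure mode in a throwaway directory (all paths here are illustrative, not the real /usr/hdp layout):

```shell
tmp=$(mktemp -d)

# Simulate /usr/hdp/current/hadoop-client pointing at a package
# directory that was never installed (a dangling symlink).
ln -s "$tmp/hadoop-2.x-not-installed" "$tmp/hadoop-client"

# mkdir gets EEXIST on the symlink itself, and the follow-up stat of
# the dangling link fails, producing the same "cannot create
# directory ... File exists" seen in the log above.
mkdir -p "$tmp/hadoop-client/lib/native/Linux-i386-32" \
  || echo "mkdir failed: dangling symlink in the path"

# A defensive guard of the kind the fix presumably adds: only attempt
# the snappy symlink when the source library actually exists.
src="$tmp/hadoop-client/lib/libsnappy.so"
[ -f "$src" ] || echo "skipping snappy symlink: $src not present"

rm -rf "$tmp"
```

With the real layout, `[ -f "$src" ]` is false both when the library is absent and when the `hadoop-client` link itself dangles, so the guard covers both cases.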
Diffs
-----
ambari-server/src/main/resources/stacks/HDP/1.3.2/hooks/before-START/scripts/params.py fc66011
ambari-server/src/main/resources/stacks/HDP/1.3.2/hooks/before-START/scripts/shared_initialization.py 81abf65
ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/params.py e275924
ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py f70eee8
ambari-server/src/test/python/stacks/1.3.2/hooks/before-START/test_before_start.py d569a95
ambari-server/src/test/python/stacks/2.0.6/hooks/before-START/test_before_start.py 54b956e
Diff: https://reviews.apache.org/r/27437/diff/
Testing
-------
mvn clean test
Thanks,
Andrew Onischuk
Re: Review Request 27437: Install on a 5 node cluster fails with link creation for libsnappy.
Posted by Vitalyi Brodetskyi <vb...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/27437/#review59381
-----------------------------------------------------------
Ship it!
Ship It!
- Vitalyi Brodetskyi
On Oct. 31, 2014, 6:43 p.m., Andrew Onischuk wrote:
Re: Review Request 27437: Install on a 5 node cluster fails with link creation for libsnappy.
Posted by Myroslav Papirkovskyy <mp...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/27437/#review59380
-----------------------------------------------------------
Ship it!
Ship It!
- Myroslav Papirkovskyy
On Oct. 31, 2014, 8:43 p.m., Andrew Onischuk wrote:
Re: Review Request 27437: Install on a 5 node cluster fails with link
creation for libsnappy.
Posted by Andrew Onischuk <ao...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/27437/
-----------------------------------------------------------
(Updated Oct. 31, 2014, 6:43 p.m.)
Review request for Ambari, Myroslav Papirkovskyy and Vitalyi Brodetskyi.
Bugs: AMBARI-8076
https://issues.apache.org/jira/browse/AMBARI-8076
Repository: ambari
Description
-------
Install on a 5 node cluster fails with link creation for libsnappy.
2014-10-29 22:38:09,249 - Error while executing command 'start':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 122, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/hook.py", line 32, in hook
    setup_hadoop()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py", line 34, in setup_hadoop
    install_snappy()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py", line 168, in install_snappy
    format("mkdir -p {so_target_dir_x86}; ln -sf {so_src_x86} {so_target_x86}"))
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 241, in action_run
    raise ex
Fail: Execution of 'mkdir -p /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32; ln -sf /usr/hdp/current/hadoop-client/lib/libsnappy.so /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so' returned 1. mkdir: cannot create directory `/usr/hdp/current/hadoop-client': File exists
ln: creating symbolic link `/usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so': No such file or directory
stdout: /var/lib/ambari-agent/data/output-78.txt
2014-10-29 22:38:08,708 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/; curl -kf -x "" --retry 10 http://pt170-1.c.pramod-thangali.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
2014-10-29 22:38:08,725 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/; curl -kf -x "" --retry 10 http://pt170-1.c.pramod-thangali.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
2014-10-29 22:38:08,726 - Group['hadoop'] {'ignore_failures': False}
2014-10-29 22:38:08,727 - Modifying group hadoop
2014-10-29 22:38:08,788 - Group['nobody'] {'ignore_failures': False}
2014-10-29 22:38:08,788 - Modifying group nobody
2014-10-29 22:38:08,824 - Group['users'] {'ignore_failures': False}
2014-10-29 22:38:08,825 - Modifying group users
2014-10-29 22:38:08,858 - Group['nagios'] {'ignore_failures': False}
2014-10-29 22:38:08,858 - Modifying group nagios
2014-10-29 22:38:08,891 - Group['knox'] {'ignore_failures': False}
2014-10-29 22:38:08,892 - Modifying group knox
2014-10-29 22:38:08,916 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
2014-10-29 22:38:08,916 - Modifying user nobody
2014-10-29 22:38:08,929 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:08,930 - Modifying user hive
2014-10-29 22:38:08,942 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2014-10-29 22:38:08,942 - Modifying user oozie
2014-10-29 22:38:08,955 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:08,955 - Modifying user nagios
2014-10-29 22:38:08,968 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2014-10-29 22:38:08,968 - Modifying user ambari-qa
2014-10-29 22:38:08,981 - User['flume'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:08,981 - Modifying user flume
2014-10-29 22:38:08,993 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:08,994 - Modifying user hdfs
2014-10-29 22:38:09,006 - User['knox'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,006 - Modifying user knox
2014-10-29 22:38:09,019 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,019 - Modifying user storm
2014-10-29 22:38:09,031 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,032 - Modifying user mapred
2014-10-29 22:38:09,044 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,044 - Modifying user hbase
2014-10-29 22:38:09,057 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2014-10-29 22:38:09,057 - Modifying user tez
2014-10-29 22:38:09,070 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,070 - Modifying user zookeeper
2014-10-29 22:38:09,082 - User['kafka'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,083 - Modifying user kafka
2014-10-29 22:38:09,095 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,095 - Modifying user falcon
2014-10-29 22:38:09,108 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,108 - Modifying user sqoop
2014-10-29 22:38:09,121 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,122 - Modifying user yarn
2014-10-29 22:38:09,134 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2014-10-29 22:38:09,135 - Modifying user hcat
2014-10-29 22:38:09,147 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2014-10-29 22:38:09,149 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
2014-10-29 22:38:09,160 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if
2014-10-29 22:38:09,161 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2014-10-29 22:38:09,162 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] {'not_if': 'test $(id -u hbase) -gt 1000'}
2014-10-29 22:38:09,173 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] due to not_if
2014-10-29 22:38:09,174 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
2014-10-29 22:38:09,174 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2014-10-29 22:38:09,185 - Skipping Link['/etc/hadoop/conf'] due to not_if
2014-10-29 22:38:09,197 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs'}
2014-10-29 22:38:09,207 - Execute['/bin/echo 0 > /selinux/enforce'] {'only_if': 'test -f /selinux/enforce'}
2014-10-29 22:38:09,235 - Execute['mkdir -p /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32; ln -sf /usr/hdp/current/hadoop-client/lib/libsnappy.so /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so'] {}
2014-10-29 22:38:09,249 - Error while executing command 'start':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 122, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/hook.py", line 32, in hook
    setup_hadoop()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py", line 34, in setup_hadoop
    install_snappy()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py", line 168, in install_snappy
    format("mkdir -p {so_target_dir_x86}; ln -sf {so_src_x86} {so_target_x86}"))
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 241, in action_run
    raise ex
Fail: Execution of 'mkdir -p /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32; ln -sf /usr/hdp/current/hadoop-client/lib/libsnappy.so /usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so' returned 1. mkdir: cannot create directory `/usr/hdp/current/hadoop-client': File exists
ln: creating symbolic link `/usr/hdp/current/hadoop-client/lib/native/Linux-i386-32/libsnappy.so': No such file or directory
Looks like this happens if there is no Hadoop-related library installed on the
host.
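One way to make the hook tolerate such hosts is to guard link creation on the presence of the source library instead of running `mkdir`/`ln` unconditionally. A minimal plain-Python sketch of that idea; the function and parameter names are illustrative, not the actual Ambari `install_snappy` code from this patch:

```python
import os

def install_snappy_guarded(so_src, so_target_dir, so_target):
    """Create the libsnappy symlink only if the source library exists.

    On hosts where no Hadoop library is installed, skip link creation
    entirely rather than failing the way the unguarded command does.
    """
    if not os.path.exists(so_src):
        return False          # nothing to link on this host; skip
    os.makedirs(so_target_dir, exist_ok=True)
    if os.path.islink(so_target) or os.path.exists(so_target):
        os.remove(so_target)  # emulate `ln -sf`: replace any existing link
    os.symlink(so_src, so_target)
    return True
```

With this guard, a host that lacks the native library is skipped silently, and a host that has it gets the `Linux-i386-32/libsnappy.so` link created (or refreshed) as before.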
Diffs
-----
ambari-server/src/main/resources/stacks/HDP/1.3.2/hooks/before-START/scripts/params.py fc66011
ambari-server/src/main/resources/stacks/HDP/1.3.2/hooks/before-START/scripts/shared_initialization.py 81abf65
ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/params.py e275924
ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py f70eee8
ambari-server/src/test/python/stacks/1.3.2/hooks/before-START/test_before_start.py d569a95
ambari-server/src/test/python/stacks/2.0.6/hooks/before-START/test_before_start.py 54b956e
Diff: https://reviews.apache.org/r/27437/diff/
Testing
-------
mvn clean test
Thanks,
Andrew Onischuk