Posted to dev@ambari.apache.org by "Nate Cole (JIRA)" <ji...@apache.org> on 2015/08/26 22:49:05 UTC

[jira] [Updated] (AMBARI-8715) Server start script incorrectly uses rpm binary to determine hadoop version

     [ https://issues.apache.org/jira/browse/AMBARI-8715?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nate Cole updated AMBARI-8715:
------------------------------
    Fix Version/s:     (was: 2.1.1)
                   2.1.2

> Server start script incorrectly uses rpm binary to determine hadoop version
> ---------------------------------------------------------------------------
>
>                 Key: AMBARI-8715
>                 URL: https://issues.apache.org/jira/browse/AMBARI-8715
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 1.7.0
>         Environment: Ubuntu 12.04
>            Reporter: Peter Klavins
>            Priority: Minor
>              Labels: ubuntu
>             Fix For: 2.1.2
>
>         Attachments: AMBARI-8715.patch
>
>
> Starting services fails on an Ambari-deployed Ubuntu 12.04 cluster because the scripts use rpm to check the hadoop version, and rpm is not installed on Ubuntu 12.04.
> For example, starting hive gives the error below (see the 'Fail' line for the reference to rpm); a distro-agnostic sketch follows the traceback.
> 2014-12-15 11:37:26,909 - Error while executing command 'start':
> Traceback (most recent call last):
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive_server.py", line 43, in start
>     self.configure(env) # FOR SECURITY
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive_server.py", line 38, in configure
>     hive(name='hiveserver2')
>   File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/hive.py", line 41, in hive
>     params.HdfsDirectory(None, action="create")
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_directory.py", line 107, in action_create
>     not_if=format("su - {hdp_hdfs_user} -c 'export PATH=$PATH:{bin_dir} ; "
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 241, in action_run
>     raise ex
> Fail: Execution of 'hadoop --config /etc/hadoop/conf fs -mkdir `rpm -q hadoop | grep -q "hadoop-1" || echo "-p"` /apps/hive/warehouse /user/hive && hadoop --config /etc/hadoop/conf fs -chmod  777 /apps/hive/warehouse && hadoop --config /etc/hadoop/conf fs -chmod  700 /user/hive && hadoop --config /etc/hadoop/conf fs -chown  hive /apps/hive/warehouse /user/hive' returned 1. -su: rpm: command not found
> mkdir: Call From master.node/192.168.10.10 to master.node:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
> mkdir: Call From master.node/192.168.10.10 to master.node:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
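> A distro-agnostic alternative is to parse the output of 'hadoop version' instead of querying the package manager. The sketch below is illustrative only (the helper name is mine and it is not the attached AMBARI-8715.patch); it assumes the 'hadoop' binary is on PATH:
>
>     import re
>     import subprocess
>
>     def hadoop_major_version():
>         # 'hadoop version' prints a line such as "Hadoop 2.4.0.2.1.1.0-385";
>         # this works the same on rpm- and dpkg-based distros.
>         out = subprocess.Popen(["hadoop", "version"],
>                                stdout=subprocess.PIPE).communicate()[0]
>         match = re.search(r"Hadoop\s+(\d+)\.", out.decode("utf-8", "ignore"))
>         return int(match.group(1)) if match else None
>
>     # Mirrors the intent of `rpm -q hadoop | grep -q "hadoop-1" || echo "-p"`:
>     # only pass -p to 'hadoop fs -mkdir' when the cluster is not Hadoop 1.
>     mkdir_option = "" if hadoop_major_version() == 1 else "-p"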



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)