Posted to issues@ambari.apache.org by "Matt (JIRA)" <ji...@apache.org> on 2016/11/09 20:06:59 UTC

[jira] [Updated] (AMBARI-18837) HAWQ Master fails to start when webhdfs is disabled

     [ https://issues.apache.org/jira/browse/AMBARI-18837?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Matt updated AMBARI-18837:
--------------------------
    Attachment: AMBARI-18837-orig.patch

> HAWQ Master fails to start when webhdfs is disabled
> ---------------------------------------------------
>
>                 Key: AMBARI-18837
>                 URL: https://issues.apache.org/jira/browse/AMBARI-18837
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: Matt
>            Assignee: Matt
>             Fix For: trunk, 2.5.0, 2.4.2
>
>         Attachments: AMBARI-18837-orig.patch
>
>
> The HdfsResource call is missing the hadoop_conf_dir and hadoop_bin_dir parameters, which are required when WebHDFS is not enabled. Without them, the non-WebHDFS code path builds its shell command with a None config directory and a None bin directory, which is what fails in the traceback below.
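> The final TypeError is just Python refusing to join a PATH list containing None (presumably the unset hadoop_bin_dir, given the {{'path': [None]}} in the stdout below). A plain-Python repro, independent of Ambari:
> {code}
> import os
>
> # Execute's 'path' arrives as [None] because hadoop_bin_dir was never passed;
> # shell._call then does os.pathsep.join(path) and hits exactly this:
> os.pathsep.join([None])  # TypeError: sequence item 0: expected string, NoneType found
> {code}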
> {code}
> stderr:   /var/lib/ambari-agent/data/errors-22205.txt
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/HAWQ/2.0.0/package/scripts/hawqmaster.py", line 98, in <module>
>     HawqMaster().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/HAWQ/2.0.0/package/scripts/hawqmaster.py", line 57, in start
>     common.start_component(hawq_constants.MASTER, params.hawq_master_address_port, params.hawq_master_dir)
>   File "/var/lib/ambari-agent/cache/common-services/HAWQ/2.0.0/package/scripts/common.py", line 292, in start_component
>     params.HdfsResource(None, action="execute")
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 469, in action_execute
>     self.get_hdfs_resource_executor().action_execute(self)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 124, in action_execute
>     logoutput=logoutput,
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 238, in action_run
>     tries=self.resource.tries, try_sleep=self.resource.try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
>     result = function(command, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
>     tries=tries, try_sleep=try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
>     result = _call(command, **kwargs_copy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 180, in _call
>     path = os.pathsep.join(path) if isinstance(path, (list, tuple)) else path
> TypeError: sequence item 0: expected string, NoneType found
> stdout:   /var/lib/ambari-agent/data/output-22205.txt
> 2016-11-02 19:27:04,973 - HdfsResource['/hawq_default'] {'security_enabled': False, 'keytab': [EMPTY], 'default_fs': 'hdfs://CentralPerk2NNSrvc', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'recursive_chown': True, 'owner': 'gpadmin', 'group': 'gpadmin', 'type': 'directory', 'action': ['create_on_execute'], 'immutable_paths': [u'/data/hive/databases', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon'], 'mode': 0755}
> 2016-11-02 19:27:04,978 - HdfsResource[None] {'security_enabled': False, 'keytab': [EMPTY], 'default_fs': 'hdfs://CentralPerk2NNSrvc', 'hdfs_site': ..., 'kinit_path_local': '/usr/bin/kinit', 'principal_name': [EMPTY], 'user': 'hdfs', 'action': ['execute'], 'immutable_paths': [u'/data/hive/databases', u'/tmp', u'/app-logs', u'/mr-history/done', u'/apps/falcon']}
> 2016-11-02 19:27:04,980 - File['/var/lib/ambari-agent/tmp/hdfs_resources_1478129224.98.json'] {'content': '[{"group": "gpadmin", "target": "/hawq_default", "action": "create", "manageIfExists": true, "mode": "755", "owner": "gpadmin", "type": "directory", "recursiveChown": true}]', 'owner': 'hdfs'}
> 2016-11-02 19:27:04,982 - Writing File['/var/lib/ambari-agent/tmp/hdfs_resources_1478129224.98.json'] because it doesn't exist
> 2016-11-02 19:27:04,983 - Changing owner for /var/lib/ambari-agent/tmp/hdfs_resources_1478129224.98.json from 0 to hdfs
> 2016-11-02 19:27:04,984 - Execute['hadoop --config None jar /var/lib/ambari-agent/lib/fast-hdfs-resource.jar /var/lib/ambari-agent/tmp/hdfs_resources_1478129224.98.json'] {'logoutput': None, 'path': [None], 'user': 'hdfs'}
> {code}
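> A minimal sketch of one way to supply the missing arguments, assuming the HAWQ package's params.py builds its HdfsResource partial the same way other stack services do (the attached patch may structure this differently; hdfs_user, hdfs_site and default_fs are assumed to already be defined in params.py, since the log above shows them being passed today):
> {code}
> # params.py (sketch): resolve the Hadoop client dirs and hand them to HdfsResource,
> # so the shell fallback used when WebHDFS is disabled renders
> # 'hadoop --config <conf_dir>' and a real PATH entry instead of None.
> import functools
>
> from resource_management.libraries.functions import conf_select, stack_select
> from resource_management.libraries.resources.hdfs_resource import HdfsResource
>
> hadoop_bin_dir = stack_select.get_hadoop_dir("bin")   # e.g. /usr/hdp/current/hadoop-client/bin
> hadoop_conf_dir = conf_select.get_hadoop_conf_dir()   # e.g. /etc/hadoop/conf
>
> HdfsResource = functools.partial(
>     HdfsResource,
>     user=hdfs_user,
>     hdfs_site=hdfs_site,
>     default_fs=default_fs,
>     hadoop_bin_dir=hadoop_bin_dir,    # previously omitted: Execute's path ended up as [None]
>     hadoop_conf_dir=hadoop_conf_dir,  # previously omitted: command rendered as 'hadoop --config None'
> )
> {code}
> With these two arguments present, the params.HdfsResource(None, action="execute") call in common.start_component should build a runnable hadoop command even when WebHDFS is disabled.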



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)