Posted to issues@ambari.apache.org by "Aravindan Vijayan (JIRA)" <ji...@apache.org> on 2017/03/24 00:12:42 UTC

[jira] [Updated] (AMBARI-20553) Ambari script error for ams-hbase while writing to Amazon s3 on a cluster with no HDFS.

     [ https://issues.apache.org/jira/browse/AMBARI-20553?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aravindan Vijayan updated AMBARI-20553:
---------------------------------------
    Status: Patch Available  (was: Open)

> Ambari script error for ams-hbase while writing to Amazon s3 on a cluster with no HDFS.
> ---------------------------------------------------------------------------------------
>
>                 Key: AMBARI-20553
>                 URL: https://issues.apache.org/jira/browse/AMBARI-20553
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-metrics
>    Affects Versions: 2.5.1
>            Reporter: Aravindan Vijayan
>            Assignee: Aravindan Vijayan
>            Priority: Blocker
>             Fix For: 2.5.1
>
>         Attachments: AMBARI-20553.patch
>
>
> Metrics collector startup scripts fail to handle s3a:// rootdir paths in the ams-hbase configuration, for example:
> {noformat}
>       {
>         "hbase-site":{
>           "properties":{
>             "hbase.rootdir":"s3a://ss-datasets/apps/hbase/",
>             "hbase.wal.dir":"file:///usr/lib/ams-hbase/data"
>           }
>         }
>       }
> {noformat}
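> The traceback below shows why this fails: the HdfsResource provider still issues a WebHDFS GETFILESTATUS call for the rootdir, appending the full s3a:// URI to the /webhdfs/v1 prefix. A rough illustration of how that malformed URL comes about (variable names here are assumptions for demonstration, not the provider's actual code):
> {noformat}
> # Illustration only, not the real provider code.
> webhdfs_base = "http://<host>:50070/webhdfs/v1"
> rootdir = "s3a://ss-datasets/apps/hbase"
> url = webhdfs_base + rootdir + "?op=GETFILESTATUS&user.name=hdfs"
> # On a cluster with no HDFS there is no NameNode listening on 50070,
> # so the curl call fails with exit code 7 (connection refused).
> {noformat}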
> {noformat}
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/metrics_collector.py", line 150, in <module>
>     AmsCollector().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 313, in execute
>     method(env)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 766, in restart
>     self.start(env)
>   File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/metrics_collector.py", line 48, in start
>     self.configure(env, action = 'start') # for security
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 116, in locking_configure
>     original_configure(obj, *args, **kw)
>   File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/metrics_collector.py", line 43, in configure
>     hbase('master', action)
>   File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
>     return fn(*args, **kwargs)
>   File "/var/lib/ambari-agent/cache/common-services/AMBARI_METRICS/0.1.0/package/scripts/hbase.py", line 213, in hbase
>     dfs_type=params.dfs_type
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 555, in action_create_on_execute
>     self.action_delayed("create")
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 552, in action_delayed
>     self.get_hdfs_resource_executor().action_delayed(action_name, self)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 279, in action_delayed
>     self._assert_valid()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 238, in _assert_valid
>     self.target_status = self._get_file_status(target)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 381, in _get_file_status
>     list_status = self.util.run_command(target, 'GETFILESTATUS', method='GET', ignore_status_codes=['404'], assertable_result=False)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 186, in run_command
>     _, out, err = get_user_call_output(cmd, user=self.run_user, logoutput=self.logoutput, quiet=False)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/get_user_call_output.py", line 61, in get_user_call_output
>     raise ExecutionFailed(err_msg, code, files_output[0], files_output[1])
> resource_management.core.exceptions.ExecutionFailed: Execution of 'curl -sS -L -w '%{http_code}' -X GET 'http://<host>:50070/webhdfs/v1s3a:/ss-datasets/apps/hbase?op=GETFILESTATUS&user.name=hdfs' 1>/tmp/tmpjkn3uB 2>/tmp/tmpCVb8Kl' returned 7. curl: (7) Failed to connect to <host> port 50070: Connection refused
> 000
> {noformat}
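> A minimal sketch of one possible guard (not necessarily what AMBARI-20553.patch does; the helper names, params attributes, and scheme check are assumptions): only go through HdfsResource when the rootdir actually lives on HDFS, and let the backing store (or HBase itself) create s3a:// and file:// directories.
> {noformat}
> # Sketch only; assumed names, written for the agent's Python 2 runtime.
> import urlparse
>
> def is_hdfs_uri(uri):
>     # An empty scheme means a plain path, which resolves to the default fs.
>     return urlparse.urlparse(uri).scheme in ('', 'hdfs')
>
> def create_hbase_rootdir(params):
>     if not is_hdfs_uri(params.hbase_root_dir):
>         return  # s3a://, file://, etc.: nothing to create over WebHDFS
>     params.HdfsResource(params.hbase_root_dir,
>                         type="directory",
>                         action="create_on_execute",
>                         owner=params.hbase_user)
>     params.HdfsResource(None, action="execute")
> {noformat}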



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)