Posted to dev@ambari.apache.org by "Dmitry Lysnichenko (JIRA)" <ji...@apache.org> on 2015/12/17 17:00:49 UTC

[jira] [Updated] (AMBARI-14420) Atlas MetaData server start fails due to missing package while adding Atlas service after Hive service

     [ https://issues.apache.org/jira/browse/AMBARI-14420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dmitry Lysnichenko updated AMBARI-14420:
----------------------------------------
    Attachment: AMBARI-14420.patch

> Atlas MetaData server start fails due to missing package while adding Atlas service after Hive service
> ------------------------------------------------------------------------------------------------------
>
>                 Key: AMBARI-14420
>                 URL: https://issues.apache.org/jira/browse/AMBARI-14420
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>            Reporter: Dmitry Lysnichenko
>            Assignee: Dmitry Lysnichenko
>         Attachments: AMBARI-14420.patch
>
>
> *Steps:*
> # Install HDP 2.1 with Ambari 2.2.0 (cluster has Hive service installed, no Atlas)
> # Register the HDP 2.3.4.0-3480 version and install packages
> # Perform EU to HDP 2.3.4.0 and let it complete
> # Add Atlas service to the cluster
> Alternative STR:
> # Install HDP 2.1 with Ambari 2.2.0 (cluster has Hive service installed, no Atlas)
> # Add Atlas service to the cluster
> Result:
> Atlas MetaData server start fails with the error below:
> {code}
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/ATLAS/0.1.0.2.3/package/scripts/metadata_server.py", line 132, in <module>
>     MetadataServer().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/ATLAS/0.1.0.2.3/package/scripts/metadata_server.py", line 53, in start
>     self.configure(env)
>   File "/var/lib/ambari-agent/cache/common-services/ATLAS/0.1.0.2.3/package/scripts/metadata_server.py", line 40, in configure
>     metadata()
>   File "/var/lib/ambari-agent/cache/common-services/ATLAS/0.1.0.2.3/package/scripts/metadata.py", line 69, in metadata
>     content = StaticFile(format('{metadata_home}/server/webapp/atlas.war'))
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 90, in action_create
>     content = self._get_content()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 127, in _get_content
>     return content()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 51, in __call__
>     return self.get_content()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 75, in get_content
>     raise Fail("{0} Source file {1} is not found".format(repr(self), path))
> resource_management.core.exceptions.Fail: StaticFile('/usr/hdp/current/atlas-server/server/webapp/atlas.war') Source file /usr/hdp/current/atlas-server/server/webapp/atlas.war is not found
> {code}
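> The bottom of the stack is a plain file-existence check: the StaticFile source resolves to /usr/hdp/current/atlas-server/server/webapp/atlas.war and fails because the package that ships the war was never installed on this host. A minimal standalone sketch of that kind of check (illustrative names only, not the actual resource_management code):
> {code}
> import os
> 
> 
> class Fail(Exception):
>     """Stand-in for resource_management.core.exceptions.Fail."""
> 
> 
> def static_file_content(path):
>     # Illustrative check: the start command aborts if the file that a package
>     # was supposed to lay down is not on disk.
>     if not os.path.isfile(path):
>         raise Fail("Source file {0} is not found".format(path))
>     with open(path, "rb") as f:
>         return f.read()
> 
> 
> # On a host where only the hive-plugin bits are present, this raises Fail:
> # static_file_content('/usr/hdp/current/atlas-server/server/webapp/atlas.war')
> {code}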
> Upon inspection it was found that, while installing packages in step 2, the package below was installed along with the Hive packages:
> {code}
> 2015-12-16 20:05:15,043 - Package['atlas-metadata*-hive-plugin'] {'use_repos': ['HDP-2.3.4.0', 'HDP-UTILS-2.3.4.0'], 'skip_repos': ['HDP-*']}
> 2015-12-16 20:05:15,043 - Installing package atlas-metadata*-hive-plugin ('/usr/bin/yum -d 0 -e 0 -y install '--disablerepo=HDP-*' --enablerepo=HDP-2.3.4.0,HDP-UTILS-2.3.4.0 'atlas-metadata*-hive-plugin'')
> {code}
> Further, while adding the Atlas service in step 4, installation of the Atlas packages is skipped because the system thinks the ATLAS packages are already present, which leads to the failure when starting the MetaData server (a small illustration of the false positive follows the install log below).
> ATLAS MetaData server install log:
> {code}
> 2015-12-17 08:03:56,583 - Package['unzip'] {}
> 2015-12-17 08:03:56,723 - Skipping installation of existing package unzip
> 2015-12-17 08:03:56,723 - Package['curl'] {}
> 2015-12-17 08:03:56,747 - Skipping installation of existing package curl
> 2015-12-17 08:03:56,748 - Package['hdp-select'] {}
> 2015-12-17 08:03:56,765 - Skipping installation of existing package hdp-select
> 2015-12-17 08:03:57,047 - Package['atlas-metadata_2_3_*'] {}
> 2015-12-17 08:03:57,170 - Skipping installation of existing package atlas-metadata_2_3_*
> 2015-12-17 08:03:57,480 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.3.4.0-3480
> 2015-12-17 08:03:57,480 - Checking if need to create versioned conf dir /etc/hadoop/2.3.4.0-3480/0
> 2015-12-17 08:03:57,481 - call['conf-select create-conf-dir --package hadoop --stack-version 2.3.4.0-3480 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
> 2015-12-17 08:03:57,520 - call returned (1, '/etc/hadoop/2.3.4.0-3480/0 exist already', '')
> 2015-12-17 08:03:57,521 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.3.4.0-3480 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
> 2015-12-17 08:03:57,561 - checked_call returned (0, '/usr/hdp/2.3.4.0-3480/hadoop/conf -> /etc/hadoop/2.3.4.0-3480/0')
> {code}
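> The skip decision above comes from matching the Atlas package pattern against the packages already installed on the host. The hive-plugin package pulled in during step 2 (e.g. atlas-metadata_2_3_4_0_3480-hive-plugin; name illustrative) matches the glob atlas-metadata_2_3_*, so the check concludes the Atlas server package is present even though it is not. A small self-contained illustration of that false positive (not the actual Ambari agent code):
> {code}
> from fnmatch import fnmatch
> 
> # Packages already on the host after step 2; the hive plugin came in with Hive.
> # (Names are illustrative.)
> installed = [
>     "atlas-metadata_2_3_4_0_3480-hive-plugin",
>     "hive_2_3_4_0_3480",
>     "hdp-select",
> ]
> 
> 
> def already_installed(package_pattern, installed_packages):
>     # Glob-based "is it installed?" check: any installed package matching the
>     # pattern is treated as proof that the requested package is present.
>     return any(fnmatch(name, package_pattern) for name in installed_packages)
> 
> 
> # The server package pattern also matches the hive-plugin package, so the
> # install of the actual atlas-server bits is skipped.
> print(already_installed("atlas-metadata_2_3_*", installed))  # True
> {code}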
> Note -
> a. This bug may not occur if, while adding the Atlas service, the host chosen for the Atlas MetaData server is different from all of the Hive hosts.
> b. The bug is also seen for EU from HDP 2.2 to 2.3.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)