Posted to issues@ambari.apache.org by "Hadoop QA (JIRA)" <ji...@apache.org> on 2016/07/13 23:47:20 UTC

[jira] [Commented] (AMBARI-17697) Hive Restart Failed During RU Due To Missing SQL JAR

    [ https://issues.apache.org/jira/browse/AMBARI-17697?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15376015#comment-15376015 ] 

Hadoop QA commented on AMBARI-17697:
------------------------------------

{color:green}+1 overall{color}.  Here are the results of testing the latest attachment 
  http://issues.apache.org/jira/secure/attachment/12817810/AMBARI-17697.patch
  against trunk revision .

    {color:green}+1 @author{color}.  The patch does not contain any @author tags.

    {color:green}+1 tests included{color}.  The patch appears to include 7 new or modified test files.

    {color:green}+1 javac{color}.  The applied patch does not increase the total number of javac compiler warnings.

    {color:green}+1 release audit{color}.  The applied patch does not increase the total number of release audit warnings.

    {color:green}+1 core tests{color}.  The patch passed unit tests in ambari-server.

Test results: https://builds.apache.org/job/Ambari-trunk-test-patch/7823//testReport/
Console output: https://builds.apache.org/job/Ambari-trunk-test-patch/7823//console

This message is automatically generated.

> Hive Restart Failed During RU Due To Missing SQL JAR
> ----------------------------------------------------
>
>                 Key: AMBARI-17697
>                 URL: https://issues.apache.org/jira/browse/AMBARI-17697
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.4.0
>            Reporter: Jonathan Hurley
>            Assignee: Jonathan Hurley
>            Priority: Critical
>             Fix For: 2.4.0
>
>         Attachments: AMBARI-17697.patch
>
>
> STR:
> # Deploy HDP 2.3 with Hive
> # Register and install HDP 2.5.0.0-945
> # Perform an RU to 2.5.0.0-945
> # After the Hive components upgrade, perform a downgrade
> {code}
> Traceback (most recent call last):
>   File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server.py\", line 211, in <module>
>     HiveServer().execute()
>   File \"/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py\", line 280, in execute
>     method(env)
>   File \"/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py\", line 709, in restart
>     self.start(env, upgrade_type=upgrade_type)
>   File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server.py\", line 85, in start
>     self.configure(env) # FOR SECURITY
>   File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server.py\", line 56, in configure
>     hive(name='hiveserver2')
>   File \"/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py\", line 89, in thunk
>     return fn(*args, **kwargs)
>   File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py\", line 277, in hive
>     jdbc_connector(params.hive2_jdbc_target, params.hive2_previous_jdbc_jar)
>   File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py\", line 467, in jdbc_connector
>     sudo = True)
>   File \"/usr/lib/python2.6/site-packages/resource_management/core/base.py\", line 155, in __init__
>     self.env.run()
>   File \"/usr/lib/python2.6/site-packages/resource_management/core/environment.py\", line 160, in run
>     self.run_action(resource, action)
>   File \"/usr/lib/python2.6/site-packages/resource_management/core/environment.py\", line 124, in run_action
>     provider_action()
>   File \"/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py\", line 273, in action_run
>     tries=self.resource.tries, try_sleep=self.resource.try_sleep)
>   File \"/usr/lib/python2.6/site-packages/resource_management/core/shell.py\", line 71, in inner
>     result = function(command, **kwargs)
>   File \"/usr/lib/python2.6/site-packages/resource_management/core/shell.py\", line 93, in checked_call
>     tries=tries, try_sleep=try_sleep)
>   File \"/usr/lib/python2.6/site-packages/resource_management/core/shell.py\", line 141, in _call_wrapper
>     result = _call(command, **kwargs_copy)
>   File \"/usr/lib/python2.6/site-packages/resource_management/core/shell.py\", line 294, in _call
>     raise Fail(err_msg)
> resource_management.core.exceptions.Fail: Execution of 'cp --remove-destination /var/lib/ambari-agent/tmp/mysql-connector-java.jar /usr/hdp/2.3.0.0-2557/hive2/lib/mysql-connector-java.jar' returned 1. cp: cannot create regular file '/usr/hdp/2.3.0.0-2557/hive2/lib/mysql-connector-java.jar': No such file or directory
> {code}
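> Side note (a sketch only, not the attached patch): the immediate failure is that {{cp}} targets a hive2 lib directory that does not exist on the 2.3 bits. A defensive guard around the copy, along these lines, would at least turn the hard failure into a no-op; {{safe_jdbc_connector}} is a hypothetical wrapper around the existing {{jdbc_connector}} in hive.py:
> {code}
> import os
>
> # Hypothetical wrapper, for illustration only: skip the copy when the
> # target component's lib directory is absent (e.g. hive2 does not exist
> # under /usr/hdp/2.3.0.0-2557).
> def safe_jdbc_connector(target, previous_jar):
>   target_directory = os.path.dirname(target)
>   if not os.path.isdir(target_directory):
>     return  # avoids "cp: cannot create regular file ...: No such file or directory"
>   jdbc_connector(target, previous_jar)  # existing function in hive.py
> {code}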
> I believe the problem is this line of code:
> {code}
> if check_stack_feature(StackFeature.HIVE_SERVER_INTERACTIVE, stack_version_unformatted):
> {code}
> On a downgrade from 2.5 to 2.3, {{version}} and {{current_version}} point to 2.3, but {{stack_version_unformatted}} is still 2.5. Because of this, we initialize the hive2 variables, which makes the script treat hive2 as a target for copying the JDBC driver:
> {code}
>   if name == 'metastore' or name == 'hiveserver2':
>     if params.hive_jdbc_target is not None and not os.path.exists(params.hive_jdbc_target):
>       jdbc_connector(params.hive_jdbc_target, params.hive_previous_jdbc_jar)
>     if params.hive2_jdbc_target is not None and not os.path.exists(params.hive2_jdbc_target):
>       jdbc_connector(params.hive2_jdbc_target, params.hive2_previous_jdbc_jar)
> {code}
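> One possible direction for the fix (again a sketch only, not necessarily how the attached patch does it; {{upgrade_direction}} and {{downgrade_from_version}} are the usual params-style variables and are assumed to be available here): base the feature check on the version the command is actually moving to rather than on {{stack_version_unformatted}}:
> {code}
> from resource_management.libraries.functions import StackFeature
> from resource_management.libraries.functions.stack_features import check_stack_feature
> from resource_management.libraries.functions.constants import Direction
>
> # Sketch: during a downgrade, "version" already points at the older stack
> # (2.3 here), so use it for feature checks instead of the current stack
> # version, which still says 2.5.
> effective_version = stack_version_unformatted
> if upgrade_direction == Direction.DOWNGRADE and downgrade_from_version:
>   effective_version = version
>
> if check_stack_feature(StackFeature.HIVE_SERVER_INTERACTIVE, effective_version):
>   pass  # only now initialize hive2_jdbc_target / hive2_previous_jdbc_jar
> {code}
> With a check like that, {{hive2_jdbc_target}} would stay {{None}} on a downgrade to 2.3, and the second {{jdbc_connector}} call above would never fire.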


