Posted to dev@ambari.apache.org by "Hadoop QA (JIRA)" <ji...@apache.org> on 2015/12/05 00:00:12 UTC

[jira] [Commented] (AMBARI-14217) RU: Spark install failed after upgrade

    [ https://issues.apache.org/jira/browse/AMBARI-14217?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15042376#comment-15042376 ] 

Hadoop QA commented on AMBARI-14217:
------------------------------------

{color:green}+1 overall{color}.  Here are the results of testing the latest attachment 
  http://issues.apache.org/jira/secure/attachment/12775790/AMBARI-14217.patch
  against trunk revision .

    {color:green}+1 @author{color}.  The patch does not contain any @author tags.

    {color:green}+1 tests included{color}.  The patch appears to include 1 new or modified test files.

    {color:green}+1 javac{color}.  The applied patch does not increase the total number of javac compiler warnings.

    {color:green}+1 release audit{color}.  The applied patch does not increase the total number of release audit warnings.

    {color:green}+1 core tests{color}.  The patch passed unit tests in .

Test results: https://builds.apache.org/job/Ambari-trunk-test-patch/4493//testReport/
Console output: https://builds.apache.org/job/Ambari-trunk-test-patch/4493//console

This message is automatically generated.

> RU: Spark install failed after upgrade
> --------------------------------------
>
>                 Key: AMBARI-14217
>                 URL: https://issues.apache.org/jira/browse/AMBARI-14217
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.2.0
>            Reporter: Dmitry Lysnichenko
>            Assignee: Dmitry Lysnichenko
>             Fix For: 2.2.0
>
>         Attachments: AMBARI-14217.patch
>
>
> After performing a Rolling Upgrade, tried to add Spark to the cluster.
> The install failed with the following error:
> {code}
> Traceback (most recent call last):
> File \"/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py\", line 59, in <module>
> SparkClient().execute()
> File \"/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py\", line 217, in execute
> method(env)
> File \"/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py\", line 34, in install
> self.install_packages(env)
> File \"/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py\", line 393, in install_packages
> Package(name)
> File \"/usr/lib/python2.6/site-packages/resource_management/core/base.py\", line 154, in __init__
> self.env.run()
> File \"/usr/lib/python2.6/site-packages/resource_management/core/environment.py\", line 158, in run
> self.run_action(resource, action)
> File \"/usr/lib/python2.6/site-packages/resource_management/core/environment.py\", line 121, in run_action
> provider_action()
> File \"/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py\", line 49, in action_install
> self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
> File \"/usr/lib/python2.6/site-packages/resource_management/core/providers/package/zypper.py\", line 57, in install_package
> self.checked_call_until_not_locked(cmd, sudo=True, logoutput=self.get_logoutput())
> File \"/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py\", line 72, in checked_call_until_not_locked
> return self.wait_until_not_locked(cmd, is_checked=True, **kwargs)
> File \"/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py\", line 80, in wait_until_not_locked
> code, out = func(cmd, **kwargs)
> File \"/usr/lib/python2.6/site-packages/resource_management/core/shell.py\", line 70, in inner
> result = function(command, **kwargs)
> File \"/usr/lib/python2.6/site-packages/resource_management/core/shell.py\", line 92, in checked_call
> tries=tries, try_sleep=try_sleep)
> File \"/usr/lib/python2.6/site-packages/resource_management/core/shell.py\", line 140, in _call_wrapper
> result = _call(command, **kwargs_copy)
> File \"/usr/lib/python2.6/site-packages/resource_management/core/shell.py\", line 291, in _call
> raise Fail(err_msg)
> resource_management.core.exceptions.Fail: Execution of '/usr/bin/zypper --quiet install --auto-agree-with-licenses --no-confirm 'spark_2_3_*'' returned 4. The following NEW packages are going to be installed:
> spark_2_3_0_0_2557 spark_2_3_0_0_2557-master spark_2_3_0_0_2557-python spark_2_3_0_0_2557-worker spark_2_3_4_0_3322 spark_2_3_4_0_3322-master spark_2_3_4_0_3322-python spark_2_3_4_0_3322-worker
> The following packages are not supported by their vendor:
> spark_2_3_0_0_2557 spark_2_3_0_0_2557-master spark_2_3_0_0_2557-python spark_2_3_0_0_2557-worker spark_2_3_4_0_3322 spark_2_3_4_0_3322-master spark_2_3_4_0_3322-python spark_2_3_4_0_3322-worker
> 8 new packages to install.
> Overall download size: 495.0 MiB. After the operation, additional 562.6 MiB will be used.
> Continue? [y/n/?] (y): y
> Permission to access 'http://repo/HDP/suse11sp3/2.x/BUILDS/2.3.4.0-3322/spark/spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557.noarch.rpm' denied.
> Abort, retry, ignore? [a/r/i/?] (a): a
> Failed to provide Package spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557. Do you want to retry retrieval?
> [HDP-2.3|http://repo/HDP/suse11sp3/2.x/BUILDS/2.3.4.0-3322/] Can't provide file './spark/spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557.noarch.rpm' from repository 'HDP-2.3'
> History:
> - Permission to access 'http://repo/HDP/suse11sp3/2.x/BUILDS/2.3.4.0-3322/spark/spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557.noarch.rpm' denied.
> - Can't provide ./spark/spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557.noarch.rpm
> Abort, retry, ignore? [a/r/i] (a): a
> Problem occured during or after installation or removal of packages:
> Installation aborted by user
> stdout: 2015-11-25 15:53:22,997 - Group['spark'] {}
> {code}
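
The root cause is visible in the log: the wildcard 'spark_2_3_*' passed to zypper matches RPMs from both the pre-upgrade build (2.3.0.0-2557) and the upgrade target (2.3.4.0-3322), and the old build's repository URL now answers "Permission denied", so the whole transaction aborts. Below is a minimal Python sketch of how such a pattern could be narrowed to the single package matching the target stack build before the package manager is invoked. This is an illustration only, not the logic of AMBARI-14217.patch; resolve_package_name and its inputs are hypothetical.

{code}
import re

def resolve_package_name(pattern, available_packages, stack_build):
    """Pick the one candidate whose name embeds the target build number.

    pattern            -- Ambari-style package pattern, e.g. 'spark_2_3_*'
    available_packages -- package names reported by the package manager
    stack_build        -- target build, e.g. '2.3.4.0-3322'
    """
    # '2.3.4.0-3322' -> '2_3_4_0_3322', the token HDP embeds in package names
    build_token = stack_build.replace('.', '_').replace('-', '_')
    regex = re.compile(pattern.replace('*', '.*') + '$')
    matches = [p for p in available_packages if regex.match(p)]
    exact = [p for p in matches if build_token in p]
    if len(exact) == 1:
        return exact[0]
    raise ValueError('cannot disambiguate %s among %s' % (pattern, matches))

# The situation from the log: both builds match 'spark_2_3_*'.
print(resolve_package_name('spark_2_3_*',
                           ['spark_2_3_0_0_2557', 'spark_2_3_4_0_3322'],
                           '2.3.4.0-3322'))
# -> spark_2_3_4_0_3322
{code}

Installing the single build-qualified name (spark_2_3_4_0_3322 and its sub-packages) never touches the stale 2.3.0.0-2557 RPMs, so the inaccessible old repository is not consulted at all.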



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)