Posted to dev@ambari.apache.org by "Hudson (JIRA)" <ji...@apache.org> on 2015/05/05 16:29:00 UTC
[jira] [Commented] (AMBARI-10928) After enabling security start services failed at Spark Client Install
[ https://issues.apache.org/jira/browse/AMBARI-10928?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14528518#comment-14528518 ]
Hudson commented on AMBARI-10928:
---------------------------------
FAILURE: Integrated in Ambari-trunk-Commit #2515 (See [https://builds.apache.org/job/Ambari-trunk-Commit/2515/])
AMBARI-10928. After enabling security start services failed at Spark Client Install (aonishuk) (aonishuk: http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=c22791cbdf6a7458831c3b7d744e7b457e264797)
* ambari-server/src/main/resources/common-services/SPARK/1.2.0.2.2/package/scripts/params.py
> After enabling security start services failed at Spark Client Install
> ---------------------------------------------------------------------
>
> Key: AMBARI-10928
> URL: https://issues.apache.org/jira/browse/AMBARI-10928
> Project: Ambari
> Issue Type: Bug
> Reporter: Andrew Onischuk
> Assignee: Andrew Onischuk
> Fix For: 2.1.0
>
>
> After enabling security, Start Services fails at 'Spark Client Install' with
> the error below.
> Here is a cluster where the issue was found: <http://172.22.70.26:8080>
> (should be available for about 10 more hours). Please take a look.
>
>
>
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 55, in <module>
>     SparkClient().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 33, in install
>     self.configure(env)
>   File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 36, in configure
>     import params
>   File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/params.py", line 143, in <module>
>     'hive.server2.enable.doAs': str(config['configurations']['hive-site']['hive.server2.enable.doAs']).lower()
> TypeError: unsupported operand type(s) for +=: 'dict' and 'dict'
>
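For context, the TypeError above is standard Python behaviour: `dict` supports neither `+` nor `+=`, so augmented assignment between two dicts fails at line 143 of params.py. A minimal sketch of the failure mode, using hypothetical stand-in dicts rather than the actual hive-site values:

```python
# Hypothetical config fragments standing in for the hive-site dicts
# merged in params.py (not the real Ambari values).
defaults = {'hive.server2.authentication': 'KERBEROS'}
overrides = {'hive.server2.enable.doAs': 'false'}

try:
    defaults += overrides  # dict defines neither __iadd__ nor __add__
except TypeError as e:
    print(e)  # "unsupported operand type(s) for +=: 'dict' and 'dict'"

# An idiomatic merge instead of +=: dict.update() mutates in place.
defaults.update(overrides)
print(sorted(defaults))
```

Replacing the `+=` with `dict.update()` (or building a fresh merged dict) is the kind of change the linked commit to params.py would need to make.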
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)