Posted to dev@ambari.apache.org by "Hudson (JIRA)" <ji...@apache.org> on 2015/04/10 12:55:13 UTC

[jira] [Commented] (AMBARI-10415) Sometimes Datanode start fails when using non-default umask

    [ https://issues.apache.org/jira/browse/AMBARI-10415?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14489374#comment-14489374 ] 

Hudson commented on AMBARI-10415:
---------------------------------

SUCCESS: Integrated in Ambari-trunk-Commit #2251 (See [https://builds.apache.org/job/Ambari-trunk-Commit/2251/])
AMBARI-10415. Sometimes Datanode start fails when using non-default umask (aonishuk) (aonishuk: http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=b8d2992e454f62d988e9fb73a77c84fb49a9fa41)
* ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/shared_initialization.py
* ambari-server/src/test/python/stacks/2.0.6/hooks/before-INSTALL/test_before_install.py
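
For context, the general shape of such a fix is to force a permissive umask inside the before-INSTALL hook, so that packages and the JDK are laid down with world-readable/executable permissions even when the host's login umask is restrictive. The sketch below is only an illustration of that idea, written in the Python 2.6 style of the agent scripts; it is not the committed patch, and the function name set_permissive_umask is hypothetical.

    # Hypothetical sketch (not the committed patch): reset a restrictive
    # host umask during the install phase so files created under
    # /usr/jdk64 stay readable/executable by service accounts such as hdfs.
    import os

    def set_permissive_umask():
        # 0022 => new files default to 0644, new directories/executables to 0755
        old_umask = os.umask(0022)
        return old_umask

In practice a hook like this would run before the package and JDK setup steps, so everything created afterwards inherits the permissive mask.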


> Sometimes Datanode start fails when using non-default umask
> -----------------------------------------------------------
>
>                 Key: AMBARI-10415
>                 URL: https://issues.apache.org/jira/browse/AMBARI-10415
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: Andrew Onischuk
>            Assignee: Andrew Onischuk
>             Fix For: 2.1.0
>
>
> 1. Set umask to 0027
> 2. Install ambari-server
> 3. Deploy Ambari
> Datanode start fails with stderr:
>     
>     
>     2015-03-22 19:21:38,154 - Error while executing command 'start':
>     Traceback (most recent call last):
>       File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
>         method(env)
>       File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/datanode.py", line 63, in start
>         datanode(action="start")
>       File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_datanode.py", line 61, in datanode
>         create_log_dir=True
>       File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/utils.py", line 219, in service
>         environment=hadoop_env_exports
>       File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>         self.env.run()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
>         self.run_action(resource, action)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
>         provider_action()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 274, in action_run
>         raise ex
>     Fail: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ;  /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /etc/hadoop/conf start datanode'' returned 1. starting datanode, logging to /grid/0/log/hadoop/hdfs/hadoop-hdfs-datanode-hea-testooziehivedatabaseoptions-ubu12-4515-5.out
>     /usr/hdp/2.2.4.0-2626//hadoop-hdfs/bin/hdfs.distro: line 281: /usr/jdk64/jdk1.7.0_67/bin/java: Permission denied
>     /usr/hdp/2.2.4.0-2626//hadoop-hdfs/bin/hdfs.distro: line 281: exec: /usr/jdk64/jdk1.7.0_67/bin/java: cannot execute: Permission denied
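>
> The failure above is a direct consequence of the 0027 umask: files and directories created while the JDK is laid down under /usr/jdk64 lose read/execute permission for "other" users, so the hdfs user cannot run .../bin/java. A minimal illustration of that permission arithmetic (not part of the issue, written for the Python 2.6 interpreter the agent scripts use) is:
>
>     # Illustration only: a 0027 umask strips read/execute for "other"
>     # from newly created files, which is why the hdfs user hits
>     # "Permission denied" on /usr/jdk64/.../bin/java after install.
>     import os
>     import stat
>     import tempfile
>
>     os.umask(0027)
>     path = tempfile.mktemp()
>     # Request mode 0755; the 0027 umask masks it down to 0750.
>     fd = os.open(path, os.O_CREAT | os.O_WRONLY, 0755)
>     os.close(fd)
>     print oct(stat.S_IMODE(os.stat(path).st_mode))  # prints 0750
>     os.remove(path)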



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)