Posted to dev@ambari.apache.org by "Hudson (JIRA)" <ji...@apache.org> on 2015/03/12 15:49:38 UTC
[jira] [Commented] (AMBARI-10041) WebHCat Server Start is failed. (Umask -027)
[ https://issues.apache.org/jira/browse/AMBARI-10041?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14358750#comment-14358750 ]
Hudson commented on AMBARI-10041:
---------------------------------
ABORTED: Integrated in Ambari-branch-2.0.0 #58 (See [https://builds.apache.org/job/Ambari-branch-2.0.0/58/])
AMBARI-10041. WebHCat Server Start is failed. (Umask -027) (aonishuk) (aonishuk: http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=b7a1e7174189be9e6f371f93795515c26a034c27)
* ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_service.py
* ambari-server/src/test/python/stacks/2.0.6/HIVE/test_webhcat_server.py
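The failure below boils down to the `not_if` liveness check on the start command (`ls /var/run/webhcat/webhcat.pid && ps -p $(cat …)`) not short-circuiting, so Ambari re-runs `webhcat_server.sh start` against an already-running daemon, which exits 1. A minimal Python sketch of that check's semantics (the function name `webhcat_is_running` is illustrative, not Ambari's actual API):

```python
import os

def webhcat_is_running(pid_file="/var/run/webhcat/webhcat.pid"):
    """Mimic the agent's no-op test: skip the start command only when
    the pid file exists, is readable, AND the recorded process is alive."""
    try:
        with open(pid_file) as f:
            pid = int(f.read().strip())
    except (IOError, OSError, ValueError):
        # An unreadable or missing pid file makes the check fail, so the
        # agent re-runs webhcat_server.sh start, which then exits 1 with
        # "already running on process <pid>".
        return False
    try:
        os.kill(pid, 0)  # signal 0: probe for existence, sends nothing
        return True
    except OSError:
        return False
```

If the pid file is created with permissions the checking user cannot read (one plausible effect of a restrictive umask), the first branch fires even though the server is up, producing exactly the "already running" failure reported here.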
> WebHCat Server Start is failed. (Umask -027)
> --------------------------------------------
>
> Key: AMBARI-10041
> URL: https://issues.apache.org/jira/browse/AMBARI-10041
> Project: Ambari
> Issue Type: Bug
> Reporter: Andrew Onischuk
> Assignee: Andrew Onischuk
> Fix For: 2.0.0
>
>
> STR:
> 1) Deploy a cluster with all services except Kafka (umask 027)
> Expected result:
> All services are started.
> Actual result:
> WebHCat Server start fails.
>
>
>
> stderr: /var/lib/ambari-agent/data/errors-195.txt
>
> 2015-03-12 05:12:50,662 - Error while executing command 'start':
> Traceback (most recent call last):
> File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
> method(env)
> File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py", line 48, in start
> webhcat_service(action = 'start')
> File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_service.py", line 33, in webhcat_service
> not_if=no_op_test
> File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
> self.env.run()
> File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
> self.run_action(resource, action)
> File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
> provider_action()
> File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 274, in action_run
> raise ex
> Fail: Execution of 'env HADOOP_HOME=/usr/hdp/current/hadoop-client /usr/hdp/current/hive-webhcat/sbin/webhcat_server.sh start' returned 1. /grid/0/hdp/2.2.2.0-2591/hive-hcatalog/sbin/webhcat_server.sh: already running on process 22165
> stdout: /var/lib/ambari-agent/data/output-195.txt
>
> 2015-03-12 05:12:24,294 - u"Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/']" {'recursive': True}
> 2015-03-12 05:12:24,486 - u"File['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip']" {'content': DownloadSource('http://us-core-ubu12-8216-1.cs1cloud.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip')}
> 2015-03-12 05:12:24,592 - Not downloading the file from http://us-core-ubu12-8216-1.cs1cloud.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip, because /var/lib/ambari-agent/data/tmp/UnlimitedJCEPolicyJDK7.zip already exists
> 2015-03-12 05:12:24,750 - u"Group['hadoop']" {'ignore_failures': False}
> 2015-03-12 05:12:24,751 - Modifying group hadoop
> 2015-03-12 05:12:24,927 - u"Group['users']" {'ignore_failures': False}
> 2015-03-12 05:12:24,927 - Modifying group users
> 2015-03-12 05:12:25,021 - u"Group['knox']" {'ignore_failures': False}
> 2015-03-12 05:12:25,021 - Modifying group knox
> 2015-03-12 05:12:25,109 - u"Group['spark']" {'ignore_failures': False}
> 2015-03-12 05:12:25,109 - Modifying group spark
> 2015-03-12 05:12:25,318 - u"User['hive']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:25,318 - Modifying user hive
> 2015-03-12 05:12:25,366 - u"User['oozie']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
> 2015-03-12 05:12:25,367 - Modifying user oozie
> 2015-03-12 05:12:25,418 - u"User['ambari-qa']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
> 2015-03-12 05:12:25,418 - Modifying user ambari-qa
> 2015-03-12 05:12:25,468 - u"User['flume']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:25,468 - Modifying user flume
> 2015-03-12 05:12:25,524 - u"User['hdfs']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:25,525 - Modifying user hdfs
> 2015-03-12 05:12:25,581 - u"User['knox']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:25,582 - Modifying user knox
> 2015-03-12 05:12:25,643 - u"User['storm']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:25,644 - Modifying user storm
> 2015-03-12 05:12:25,703 - u"User['spark']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:25,704 - Modifying user spark
> 2015-03-12 05:12:25,757 - u"User['mapred']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:25,758 - Modifying user mapred
> 2015-03-12 05:12:25,808 - u"User['hbase']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:25,809 - Modifying user hbase
> 2015-03-12 05:12:25,859 - u"User['tez']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
> 2015-03-12 05:12:25,859 - Modifying user tez
> 2015-03-12 05:12:25,911 - u"User['zookeeper']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:25,912 - Modifying user zookeeper
> 2015-03-12 05:12:25,962 - u"User['false']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:25,962 - Modifying user false
> 2015-03-12 05:12:26,014 - u"User['falcon']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:26,014 - Modifying user falcon
> 2015-03-12 05:12:26,065 - u"User['sqoop']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:26,066 - Modifying user sqoop
> 2015-03-12 05:12:26,116 - u"User['yarn']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:26,117 - Modifying user yarn
> 2015-03-12 05:12:26,181 - u"User['hcat']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:26,181 - Modifying user hcat
> 2015-03-12 05:12:26,241 - u"User['ams']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-12 05:12:26,241 - Modifying user ams
> 2015-03-12 05:12:26,291 - u"File['/var/lib/ambari-agent/data/tmp/changeUid.sh']" {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2015-03-12 05:12:26,626 - u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']" {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
> 2015-03-12 05:12:26,675 - Skipping u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']" due to not_if
> 2015-03-12 05:12:26,675 - u"Directory['/grid/1/hadoop/hbase']" {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
> 2015-03-12 05:12:27,560 - u"File['/var/lib/ambari-agent/data/tmp/changeUid.sh']" {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2015-03-12 05:12:27,946 - u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/1/hadoop/hbase']" {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
> 2015-03-12 05:12:28,001 - Skipping u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/1/hadoop/hbase']" due to not_if
> 2015-03-12 05:12:28,002 - u"Group['hdfs']" {'ignore_failures': False}
> 2015-03-12 05:12:28,003 - Modifying group hdfs
> 2015-03-12 05:12:28,133 - u"User['hdfs']" {'ignore_failures': False, 'groups': [u'hadoop', 'users', 'hdfs', 'hadoop', u'hdfs']}
> 2015-03-12 05:12:28,133 - Modifying user hdfs
> 2015-03-12 05:12:28,206 - u"Directory['/etc/hadoop']" {'mode': 0755}
> 2015-03-12 05:12:28,388 - u"Directory['/etc/hadoop/conf.empty']" {'owner': 'root', 'group': 'hadoop', 'recursive': True}
> 2015-03-12 05:12:28,556 - u"Link['/etc/hadoop/conf']" {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
> 2015-03-12 05:12:28,613 - Skipping u"Link['/etc/hadoop/conf']" due to not_if
> 2015-03-12 05:12:28,627 - u"File['/etc/hadoop/conf/hadoop-env.sh']" {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
> 2015-03-12 05:12:28,920 - u"Execute['('setenforce', '0')']" {'sudo': True, 'only_if': 'test -f /selinux/enforce'}
> 2015-03-12 05:12:28,995 - Skipping u"Execute['('setenforce', '0')']" due to only_if
> 2015-03-12 05:12:28,995 - u"Directory['/grid/0/log/hadoop']" {'owner': 'root', 'mode': 0775, 'group': 'hadoop', 'recursive': True, 'cd_access': 'a'}
> 2015-03-12 05:12:29,610 - u"Directory['/var/run/hadoop']" {'owner': 'root', 'group': 'root', 'recursive': True, 'cd_access': 'a'}
> 2015-03-12 05:12:30,097 - u"Directory['/tmp/hadoop-hdfs']" {'owner': 'hdfs', 'recursive': True, 'cd_access': 'a'}
> 2015-03-12 05:12:30,476 - u"File['/etc/hadoop/conf/commons-logging.properties']" {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
> 2015-03-12 05:12:30,746 - u"File['/etc/hadoop/conf/health_check']" {'content': Template('health_check-v2.j2'), 'owner': 'hdfs'}
> 2015-03-12 05:12:31,012 - u"File['/etc/hadoop/conf/log4j.properties']" {'content': '...', 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
> 2015-03-12 05:12:31,286 - u"File['/etc/hadoop/conf/hadoop-metrics2.properties']" {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
> 2015-03-12 05:12:31,563 - u"File['/etc/hadoop/conf/task-log4j.properties']" {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
> 2015-03-12 05:12:32,104 - u"HdfsDirectory['/user/hcat']" {'security_enabled': False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '', 'mode': 0755, 'owner': 'hcat', 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'action': ['create_delayed']}
> 2015-03-12 05:12:32,104 - u"HdfsDirectory['None']" {'security_enabled': False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '', 'action': ['create'], 'bin_dir': '/usr/hdp/current/hadoop-client/bin'}
> 2015-03-12 05:12:32,106 - u"Execute['hadoop --config /etc/hadoop/conf fs -mkdir -p /user/hcat && hadoop --config /etc/hadoop/conf fs -chmod 755 /user/hcat && hadoop --config /etc/hadoop/conf fs -chown hcat /user/hcat']" {'not_if': "ambari-sudo.sh su hdfs -l -s /bin/bash -c 'hadoop --config /etc/hadoop/conf fs -ls /user/hcat'", 'user': 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
> 2015-03-12 05:12:35,270 - Skipping u"Execute['hadoop --config /etc/hadoop/conf fs -mkdir -p /user/hcat && hadoop --config /etc/hadoop/conf fs -chmod 755 /user/hcat && hadoop --config /etc/hadoop/conf fs -chown hcat /user/hcat']" due to not_if
> 2015-03-12 05:12:35,270 - u"Directory['/var/run/webhcat']" {'owner': 'hcat', 'group': 'hadoop', 'recursive': True, 'mode': 0755}
> 2015-03-12 05:12:35,433 - u"Directory['/grid/0/log/webhcat']" {'owner': 'hcat', 'group': 'hadoop', 'recursive': True, 'mode': 0755}
> 2015-03-12 05:12:35,599 - u"Directory['/etc/hive-webhcat/conf']" {'owner': 'hcat', 'group': 'hadoop', 'recursive': True}
> 2015-03-12 05:12:35,821 - u"ExecuteHadoop['fs -ls hdfs:///hdp/apps/2.2.2.0-2591/hive/hive.tar.gz']" {'logoutput': True, 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hcat', 'conf_dir': '/etc/hadoop/conf'}
> 2015-03-12 05:12:35,854 - u"Execute['hadoop --config /etc/hadoop/conf fs -ls hdfs:///hdp/apps/2.2.2.0-2591/hive/hive.tar.gz']" {'logoutput': True, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hcat', 'path': ['/usr/hdp/current/hadoop-client/bin']}
> -r--r--r-- 3 hdfs hadoop 82989636 2015-03-12 04:56 hdfs:///hdp/apps/2.2.2.0-2591/hive/hive.tar.gz
> 2015-03-12 05:12:39,958 - u"ExecuteHadoop['fs -ls hdfs:///hdp/apps/2.2.2.0-2591/pig/pig.tar.gz']" {'logoutput': True, 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hcat', 'conf_dir': '/etc/hadoop/conf'}
> 2015-03-12 05:12:39,959 - u"Execute['hadoop --config /etc/hadoop/conf fs -ls hdfs:///hdp/apps/2.2.2.0-2591/pig/pig.tar.gz']" {'logoutput': True, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hcat', 'path': ['/usr/hdp/current/hadoop-client/bin']}
> -r--r--r-- 3 hdfs hadoop 97532554 2015-03-12 04:57 hdfs:///hdp/apps/2.2.2.0-2591/pig/pig.tar.gz
> 2015-03-12 05:12:43,171 - u"ExecuteHadoop['fs -ls hdfs:///hdp/apps/2.2.2.0-2591/mapreduce/hadoop-streaming.jar']" {'logoutput': True, 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hcat', 'conf_dir': '/etc/hadoop/conf'}
> 2015-03-12 05:12:43,172 - u"Execute['hadoop --config /etc/hadoop/conf fs -ls hdfs:///hdp/apps/2.2.2.0-2591/mapreduce/hadoop-streaming.jar']" {'logoutput': True, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hcat', 'path': ['/usr/hdp/current/hadoop-client/bin']}
> -r--r--r-- 3 hdfs hadoop 104999 2015-03-12 04:58 hdfs:///hdp/apps/2.2.2.0-2591/mapreduce/hadoop-streaming.jar
> 2015-03-12 05:12:46,569 - u"ExecuteHadoop['fs -ls hdfs:///hdp/apps/2.2.2.0-2591/sqoop/sqoop.tar.gz']" {'logoutput': True, 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'user': 'hcat', 'conf_dir': '/etc/hadoop/conf'}
> 2015-03-12 05:12:46,571 - u"Execute['hadoop --config /etc/hadoop/conf fs -ls hdfs:///hdp/apps/2.2.2.0-2591/sqoop/sqoop.tar.gz']" {'logoutput': True, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hcat', 'path': ['/usr/hdp/current/hadoop-client/bin']}
> -r--r--r-- 3 hdfs hadoop 5495240 2015-03-12 04:58 hdfs:///hdp/apps/2.2.2.0-2591/sqoop/sqoop.tar.gz
> 2015-03-12 05:12:49,559 - u"XmlConfig['webhcat-site.xml']" {'owner': 'hcat', 'group': 'hadoop', 'conf_dir': '/etc/hive-webhcat/conf', 'configuration_attributes': {}, 'configurations': ...}
> 2015-03-12 05:12:49,571 - Generating config: /etc/hive-webhcat/conf/webhcat-site.xml
> 2015-03-12 05:12:49,572 - u"File['/etc/hive-webhcat/conf/webhcat-site.xml']" {'owner': 'hcat', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
> 2015-03-12 05:12:49,787 - Writing u"File['/etc/hive-webhcat/conf/webhcat-site.xml']" because contents don't match
> 2015-03-12 05:12:49,949 - u"File['/etc/hive-webhcat/conf/webhcat-env.sh']" {'content': InlineTemplate(...), 'owner': 'hcat', 'group': 'hadoop'}
> 2015-03-12 05:12:50,203 - u"File['/etc/hive-webhcat/conf/webhcat-log4j.properties']" {'content': '...', 'owner': 'hcat', 'group': 'hadoop', 'mode': 0644}
> 2015-03-12 05:12:50,463 - u"Execute['env HADOOP_HOME=/usr/hdp/current/hadoop-client /usr/hdp/current/hive-webhcat/sbin/webhcat_server.sh start']" {'not_if': 'ls /var/run/webhcat/webhcat.pid >/dev/null 2>&1 && ps -p `cat /var/run/webhcat/webhcat.pid` >/dev/null 2>&1', 'user': 'hcat'}
> 2015-03-12 05:12:50,662 - Error while executing command 'start':
> Traceback (most recent call last):
> File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
> method(env)
> File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py", line 48, in start
> webhcat_service(action = 'start')
> File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_service.py", line 33, in webhcat_service
> not_if=no_op_test
> File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
> self.env.run()
> File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
> self.run_action(resource, action)
> File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
> provider_action()
> File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 274, in action_run
> raise ex
> Fail: Execution of 'env HADOOP_HOME=/usr/hdp/current/hadoop-client /usr/hdp/current/hive-webhcat/sbin/webhcat_server.sh start' returned 1. /grid/0/hdp/2.2.2.0-2591/hive-hcatalog/sbin/webhcat_server.sh: already running on process 22165
> 2015-03-12 05:12:50,711 - Command: /usr/bin/hdp-select status hive-webhcat > /tmp/tmparyY8b
> Output: hive-webhcat - 2.2.2.0-2591
>
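For context on the "(Umask -027)" qualifier in the summary: a process creates files with mode `requested & ~umask`, so under umask 027 a file opened with the usual 0666 request comes out 0640 (no access for "other"). A short sketch of that arithmetic, which is the mechanism by which a restrictive umask can make artifacts like pid files unreadable across users:

```python
import os
import stat
import tempfile

def created_mode(umask_value, requested=0o666):
    """Create a file under the given umask and report the mode it gets."""
    old = os.umask(umask_value)
    try:
        path = os.path.join(tempfile.mkdtemp(), "webhcat.pid")
        fd = os.open(path, os.O_CREAT | os.O_WRONLY, requested)
        os.close(fd)
        # Effective mode is requested & ~umask_value.
        return stat.S_IMODE(os.stat(path).st_mode)
    finally:
        os.umask(old)  # restore the previous umask
```

Whether this exact permission interaction is what AMBARI-10041's patch to `webhcat_service.py` addresses is not stated in the log above; the sketch only illustrates why a 027 umask changes file-creation behavior relative to the default 022.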
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)