Posted to dev@ambari.apache.org by "Hudson (JIRA)" <ji...@apache.org> on 2015/05/13 13:31:59 UTC

[jira] [Commented] (AMBARI-11096) HiveServer2 Start fails during kerberization

    [ https://issues.apache.org/jira/browse/AMBARI-11096?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14541748#comment-14541748 ] 

Hudson commented on AMBARI-11096:
---------------------------------

FAILURE: Integrated in Ambari-trunk-Commit #2580 (See [https://builds.apache.org/job/Ambari-trunk-Commit/2580/])
AMBARI-11096. HiveServer2 Start fails during kerberization (aonishuk) (aonishuk: http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=090d8b53291ed0e011cdab109eb3fa60cefbf7dc)
* ambari-server/src/main/resources/common-services/YARN/2.1.0.2.0/package/scripts/mapred_service_check.py
* ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/params_linux.py
* ambari-server/src/test/python/stacks/2.0.6/YARN/test_mapreduce2_service_check.py


> HiveServer2 Start fails during kerberization
> --------------------------------------------
>
>                 Key: AMBARI-11096
>                 URL: https://issues.apache.org/jira/browse/AMBARI-11096
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: Andrew Onischuk
>            Assignee: Andrew Onischuk
>             Fix For: 2.1.0
>
>
> **Logs**
>     
>     
>     2015-05-13 02:04:55,162 - Group['hadoop'] {'ignore_failures': False}
>     2015-05-13 02:04:55,164 - Modifying group hadoop
>     2015-05-13 02:04:55,294 - Group['users'] {'ignore_failures': False}
>     2015-05-13 02:04:55,295 - Modifying group users
>     2015-05-13 02:04:55,377 - Group['knox'] {'ignore_failures': False}
>     2015-05-13 02:04:55,378 - Modifying group knox
>     2015-05-13 02:04:55,471 - Group['spark'] {'ignore_failures': False}
>     2015-05-13 02:04:55,472 - Modifying group spark
>     2015-05-13 02:04:55,554 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:55,555 - Modifying user hive
>     2015-05-13 02:04:55,647 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
>     2015-05-13 02:04:55,648 - Modifying user oozie
>     2015-05-13 02:04:55,731 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
>     2015-05-13 02:04:55,732 - Modifying user ambari-qa
>     2015-05-13 02:04:55,826 - User['flume'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:55,827 - Modifying user flume
>     2015-05-13 02:04:55,911 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:55,912 - Modifying user hdfs
>     2015-05-13 02:04:56,001 - User['knox'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:56,002 - Modifying user knox
>     2015-05-13 02:04:56,093 - User['spark'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:56,094 - Modifying user spark
>     2015-05-13 02:04:56,178 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:56,179 - Modifying user mapred
>     2015-05-13 02:04:56,263 - User['accumulo'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:56,264 - Modifying user accumulo
>     2015-05-13 02:04:56,355 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:56,356 - Modifying user hbase
>     2015-05-13 02:04:56,440 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
>     2015-05-13 02:04:56,441 - Modifying user tez
>     2015-05-13 02:04:56,527 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:56,528 - Modifying user zookeeper
>     2015-05-13 02:04:56,622 - User['mahout'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:56,623 - Modifying user mahout
>     2015-05-13 02:04:56,706 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:56,707 - Modifying user sqoop
>     2015-05-13 02:04:56,792 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:56,793 - Modifying user yarn
>     2015-05-13 02:04:56,886 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:56,887 - Modifying user hcat
>     2015-05-13 02:04:56,980 - User['ams'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2015-05-13 02:04:56,981 - Modifying user ams
>     2015-05-13 02:04:57,065 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>     2015-05-13 02:04:57,602 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
>     2015-05-13 02:04:57,693 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
>     2015-05-13 02:04:57,694 - Directory['/grid/0/hadoop/hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
>     2015-05-13 02:04:58,745 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>     2015-05-13 02:04:59,418 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/0/hadoop/hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
>     2015-05-13 02:04:59,503 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/0/hadoop/hbase'] due to not_if
>     2015-05-13 02:04:59,505 - Group['hdfs'] {'ignore_failures': False}
>     2015-05-13 02:04:59,505 - Modifying group hdfs
>     2015-05-13 02:04:59,592 - User['hdfs'] {'ignore_failures': False, 'groups': [u'hadoop', 'hadoop', 'hdfs', u'hdfs']}
>     2015-05-13 02:04:59,593 - Modifying user hdfs
>     2015-05-13 02:04:59,678 - Directory['/etc/hadoop'] {'mode': 0755}
>     2015-05-13 02:05:00,005 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}
>     2015-05-13 02:05:00,517 - Execute['('setenforce', '0')'] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
>     2015-05-13 02:05:00,883 - Directory['/grid/0/log/hadoop'] {'owner': 'root', 'mode': 0775, 'group': 'hadoop', 'recursive': True, 'cd_access': 'a'}
>     2015-05-13 02:05:01,929 - Directory['/var/run/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True, 'cd_access': 'a'}
>     2015-05-13 02:05:02,782 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'recursive': True, 'cd_access': 'a'}
>     2015-05-13 02:05:03,539 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'root'}
>     2015-05-13 02:05:04,091 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'root'}
>     2015-05-13 02:05:04,609 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': '...', 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
>     2015-05-13 02:05:05,149 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
>     2015-05-13 02:05:05,660 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
>     2015-05-13 02:05:06,268 - File['/usr/hdp/current/hadoop-client/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
>     2015-05-13 02:05:07,285 - HdfsResource['hdfs:///hdp/apps/2.3.0.0-1957/mapreduce//mapreduce.tar.gz'] {'security_enabled': True, 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'source': '/usr/hdp/current/hadoop-client/mapreduce.tar.gz', 'kinit_path_local': '/usr/bin/kinit', 'user': 'hdfs@EXAMPLE.COM', 'action': ['create_on_execute'], 'group': 'hadoop', 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'type': 'file', 'mode': 0444}
>     2015-05-13 02:05:07,287 - HdfsResource['/user/hcat'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'kinit_path_local': '/usr/bin/kinit', 'user': 'hdfs@EXAMPLE.COM', 'owner': 'hcat', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'mode': 0755}
>     2015-05-13 02:05:07,289 - HdfsResource['hdfs:///hdp/apps/2.3.0.0-1957/mapreduce//hadoop-streaming.jar'] {'security_enabled': True, 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'source': '/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming.jar', 'kinit_path_local': '/usr/bin/kinit', 'user': 'hdfs@EXAMPLE.COM', 'action': ['create_on_execute'], 'group': 'hadoop', 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'type': 'file', 'mode': 0444}
>     2015-05-13 02:05:07,292 - HdfsResource['hdfs:///hdp/apps/2.3.0.0-1957/pig//pig.tar.gz'] {'security_enabled': True, 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'source': '/usr/hdp/current/pig-client/pig.tar.gz', 'kinit_path_local': '/usr/bin/kinit', 'user': 'hdfs@EXAMPLE.COM', 'action': ['create_on_execute'], 'group': 'hadoop', 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'type': 'file', 'mode': 0444}
>     2015-05-13 02:05:07,294 - HdfsResource['hdfs:///hdp/apps/2.3.0.0-1957/hive//hive.tar.gz'] {'security_enabled': True, 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'source': '/usr/hdp/current/hive-client/hive.tar.gz', 'kinit_path_local': '/usr/bin/kinit', 'user': 'hdfs@EXAMPLE.COM', 'action': ['create_on_execute'], 'group': 'hadoop', 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'type': 'file', 'mode': 0444}
>     2015-05-13 02:05:07,301 - HdfsResource['hdfs:///hdp/apps/2.3.0.0-1957/sqoop//sqoop.tar.gz/sqoop.tar.gz'] {'security_enabled': True, 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'source': '/usr/hdp/current/sqoop-client/sqoop.tar.gz', 'kinit_path_local': '/usr/bin/kinit', 'user': 'hdfs@EXAMPLE.COM', 'action': ['create_on_execute'], 'group': 'hadoop', 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'type': 'file', 'mode': 0444}
>     2015-05-13 02:05:07,301 - HdfsResource['/apps/hive/warehouse'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'kinit_path_local': '/usr/bin/kinit', 'user': 'hdfs@EXAMPLE.COM', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'mode': 0777}
>     2015-05-13 02:05:07,302 - HdfsResource['/user/hive'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'kinit_path_local': '/usr/bin/kinit', 'user': 'hdfs@EXAMPLE.COM', 'owner': 'hive', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'mode': 0700}
>     2015-05-13 02:05:07,302 - HdfsResource['None'] {'security_enabled': True, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'kinit_path_local': '/usr/bin/kinit', 'user': 'hdfs@EXAMPLE.COM', 'action': ['execute'], 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf'}
>     2015-05-13 02:05:07,303 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@EXAMPLE.COM'] {'user': 'hdfs@EXAMPLE.COM'}
>     2015-05-13 02:05:07,404 - Error while executing command 'start':
>     Traceback (most recent call last):
>       File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
>         method(env)
>       File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server.py", line 71, in start
>         self.configure(env) # FOR SECURITY
>       File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive_server.py", line 42, in configure
>         hive(name='hiveserver2')
>       File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
>         return fn(*args, **kwargs)
>       File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/hive.py", line 174, in hive
>         params.HdfsResource(None, action="execute")
>       File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>         self.env.run()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
>         self.run_action(resource, action)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
>         provider_action()
>       File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 101, in action_execute
>         user=user
>       File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>         self.env.run()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
>         self.run_action(resource, action)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
>         provider_action()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 269, in action_run
>         raise ex
>     Fail: Execution of '/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@EXAMPLE.COM' returned 125. su: user hdfs@EXAMPLE.COM does not exist
>     
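The root cause is visible in the last two log lines: the `HdfsResource` provider ran `kinit` via `su` with `'user': 'hdfs@EXAMPLE.COM'`, i.e. the full Kerberos principal, but `su` expects a local OS account name, hence `su: user hdfs@EXAMPLE.COM does not exist`. A minimal Python sketch of the broken versus fixed command construction (hypothetical helper, not the actual Ambari code):

```python
# Values mirroring the log above; adjust for your cluster.
KEYTAB = "/etc/security/keytabs/hdfs.headless.keytab"
PRINCIPAL = "hdfs@EXAMPLE.COM"   # Kerberos principal, passed to kinit
LOCAL_USER = "hdfs"              # local OS account, what `su` expects

def kinit_command(keytab, principal, run_as):
    # Build the su + kinit invocation. `run_as` must be the local account
    # name; passing the principal here reproduces the failure in the log.
    return ["su", run_as, "-c",
            "/usr/bin/kinit -kt %s %s" % (keytab, principal)]

# Broken: su target is the principal -> "su: user hdfs@EXAMPLE.COM does not exist"
broken = kinit_command(KEYTAB, PRINCIPAL, run_as=PRINCIPAL)

# Fixed: su as the local user; the principal goes only to kinit
fixed = kinit_command(KEYTAB, PRINCIPAL, run_as=LOCAL_USER)
```

The fix in the linked commit follows the same idea: keep the local service user and the headless principal as separate parameters in `params_linux.py`, so resources that shell out as a user never receive the `user@REALM` form.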


