Posted to issues@ambari.apache.org by "Yesha Vora (JIRA)" <ji...@apache.org> on 2017/10/03 23:23:00 UTC
[jira] [Created] (AMBARI-22126) Regenerate keytab operation updates livy.server.launch.kerberos.keytab incorrectly
Yesha Vora created AMBARI-22126:
-----------------------------------
Summary: Regenerate keytab operation updates livy.server.launch.kerberos.keytab incorrectly
Key: AMBARI-22126
URL: https://issues.apache.org/jira/browse/AMBARI-22126
Project: Ambari
Issue Type: Bug
Affects Versions: 2.6.0
Reporter: Yesha Vora
Scenario:
1) Install Ambari 2.5.0 and HDP 2.6.0.
   livy.conf has livy.server.launch.kerberos.keytab set to /etc/security/keytabs/livy2.service.keytab, and that keytab is present on the host.
2) Upgrade Ambari to 2.6.0.
3) Regenerate keytabs for missing components.
4) Restart services with stale configs.

At this point the Livy start operation fails, because the regeneration changed livy.server.launch.kerberos.keytab to /etc/security/keytabs/livy.service.keytab, and livy.service.keytab is not present on the host.
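The mismatch can be detected before any kinit is attempted. A minimal sketch (not Ambari code; the property-file parsing and helper names are assumptions for illustration) that reads the configured keytab path out of livy.conf and checks that the file actually exists on disk:

```python
# Sketch only: fail fast if the keytab configured in livy.conf is absent,
# instead of letting kinit fail later with "Key table file ... not found".
import os


def configured_keytab(conf_path):
    """Return the value of livy.server.launch.kerberos.keytab from a
    livy.conf-style 'key = value' file, or None if the key is not set."""
    with open(conf_path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith("livy.server.launch.kerberos.keytab"):
                return line.split("=", 1)[1].strip()
    return None


def keytab_is_present(conf_path):
    """True only when the configured keytab path exists on this host."""
    keytab = configured_keytab(conf_path)
    return keytab is not None and os.path.isfile(keytab)
```

Run against the cluster described above, such a check would report the configured path /etc/security/keytabs/livy.service.keytab as missing while /etc/security/keytabs/livy2.service.keytab is still on disk.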
{code}
stderr: /var/lib/ambari-agent/data/errors-731.txt
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package/scripts/livy2_server.py", line 144, in <module>
    LivyServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 350, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package/scripts/livy2_server.py", line 59, in start
    self.wait_for_dfs_directories_created([params.entity_groupfs_store_dir, params.entity_groupfs_active_dir])
  File "/var/lib/ambari-agent/cache/common-services/SPARK2/2.0.0/package/scripts/livy2_server.py", line 84, in wait_for_dfs_directories_created
    user=params.livy2_user
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/kinit -kt /etc/security/keytabs/livy.service.keytab livy/xxx@EXAMPLE.COM' returned 1. kinit: Key table file '/etc/security/keytabs/livy.service.keytab' not found while getting initial credentials
stdout: /var/lib/ambari-agent/data/output-731.txt
2017-10-03 19:10:39,638 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.0.3-8 -> 2.6.0.3-8
2017-10-03 19:10:39,641 - Using hadoop conf dir: /usr/hdp/2.6.0.3-8/hadoop/conf
2017-10-03 19:10:39,987 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=2.6.0.3-8 -> 2.6.0.3-8
2017-10-03 19:10:39,988 - Using hadoop conf dir: /usr/hdp/2.6.0.3-8/hadoop/conf
2017-10-03 19:10:39,989 - Group['livy'] {}
2017-10-03 19:10:39,990 - Group['spark'] {}
2017-10-03 19:10:39,990 - Group['hdfs'] {}
2017-10-03 19:10:39,991 - Group['hadoop'] {}
2017-10-03 19:10:39,991 - Group['users'] {}
2017-10-03 19:10:39,992 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2017-10-03 19:10:39,993 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2017-10-03 19:10:39,994 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2017-10-03 19:10:39,995 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2017-10-03 19:10:39,996 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2017-10-03 19:10:39,997 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2017-10-03 19:10:39,998 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2017-10-03 19:10:39,999 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2017-10-03 19:10:40,000 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users'], 'uid': None}
2017-10-03 19:10:40,001 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2017-10-03 19:10:40,002 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2017-10-03 19:10:40,003 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2017-10-03 19:10:40,004 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2017-10-03 19:10:40,005 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2017-10-03 19:10:40,006 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2017-10-03 19:10:40,007 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2017-10-03 19:10:40,008 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-03 19:10:40,010 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-10-03 19:10:40,034 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2017-10-03 19:10:40,035 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2017-10-03 19:10:40,036 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-03 19:10:40,038 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-10-03 19:10:40,039 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2017-10-03 19:10:40,065 - call returned (0, '1002')
2017-10-03 19:10:40,066 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1002'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-10-03 19:10:40,082 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1002'] due to not_if
2017-10-03 19:10:40,084 - Group['hdfs'] {}
2017-10-03 19:10:40,085 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hdfs']}
2017-10-03 19:10:40,086 - FS Type:
2017-10-03 19:10:40,087 - Directory['/etc/hadoop'] {'mode': 0755}
2017-10-03 19:10:40,112 - File['/usr/hdp/2.6.0.3-8/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}
2017-10-03 19:10:40,113 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-10-03 19:10:40,154 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2017-10-03 19:10:40,194 - Directory['/grid/0/log/hdfs'] {'owner': 'root', 'create_parents': True, 'group': 'hadoop', 'mode': 0775, 'cd_access': 'a'}
2017-10-03 19:10:40,195 - Directory['/var/run/hadoop'] {'owner': 'root', 'create_parents': True, 'group': 'root', 'cd_access': 'a'}
2017-10-03 19:10:40,196 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'create_parents': True, 'cd_access': 'a'}
2017-10-03 19:10:40,201 - File['/usr/hdp/2.6.0.3-8/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'root'}
2017-10-03 19:10:40,203 - File['/usr/hdp/2.6.0.3-8/hadoop/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'root'}
2017-10-03 19:10:40,209 - File['/usr/hdp/2.6.0.3-8/hadoop/conf/log4j.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2017-10-03 19:10:40,219 - File['/usr/hdp/2.6.0.3-8/hadoop/conf/hadoop-metrics2.properties'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2017-10-03 19:10:40,219 - File['/usr/hdp/2.6.0.3-8/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2017-10-03 19:10:40,220 - File['/usr/hdp/2.6.0.3-8/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2017-10-03 19:10:40,225 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop', 'mode': 0644}
2017-10-03 19:10:40,248 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2017-10-03 19:10:40,833 - Using hadoop conf dir: /usr/hdp/2.6.0.3-8/hadoop/conf
2017-10-03 19:10:40,837 - Verifying DFS directories where ATS stores time line data for active and completed applications.
2017-10-03 19:10:40,837 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/livy.service.keytab livy/xxx@EXAMPLE.COM'] {'user': 'livy'}
Command failed after 1 tries
{code}
The Regenerate Keytabs operation should not modify the livy.server.launch.kerberos.keytab property.
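One way to state the expected behavior: a keytab property already set on the cluster should survive regeneration rather than be overwritten with the stack default. A minimal sketch of that merge rule, assuming configs are plain dicts (this is an illustration, not Ambari's actual configuration-merge code):

```python
# Sketch only: preserve an operator-set keytab path when merging in
# regenerated Kerberos configuration.
def merge_keytab_property(current_conf, regenerated_conf,
                          key="livy.server.launch.kerberos.keytab"):
    """Start from the regenerated config, but keep the current value of
    `key` if the cluster already has one set."""
    merged = dict(regenerated_conf)
    if key in current_conf:
        # e.g. keep /etc/security/keytabs/livy2.service.keytab instead of
        # overwriting it with /etc/security/keytabs/livy.service.keytab
        merged[key] = current_conf[key]
    return merged
```

With the values from this report, the merge would leave livy.server.launch.kerberos.keytab pointing at the livy2.service.keytab file that actually exists on the host.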
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)