Posted to dev@ambari.apache.org by "Zack Marsh (JIRA)" <ji...@apache.org> on 2015/04/29 23:20:06 UTC

[jira] [Updated] (AMBARI-10849) HBase Client Install fails after enabling Kerberos

     [ https://issues.apache.org/jira/browse/AMBARI-10849?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Zack Marsh updated AMBARI-10849:
--------------------------------
    Description: 
After enabling Kerberos via the Kerberos Wizard in Ambari, the "Start and Test Services" step fails: the "HBase Client Install" operation aborts with the following output, as seen in Ambari:

stderr:
{code}
2015-04-29 14:05:13,322 - Error while executing command 'install':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/hbase_client.py", line 31, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/hbase_client.py", line 34, in configure
    import params
  File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/params.py", line 26, in <module>
    from params_linux import *
  File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/params_linux.py", line 127, in <module>
    queryserver_jaas_princ = config['configurations']['hbase-site']['phoenix.queryserver.kerberos.principal'].replace('_HOST',_hostname_lowercase)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/config_dictionary.py", line 79, in __getattr__
    raise Fail("Configuration parameter '" + self.name + "' was not found in configurations dictionary!")
Fail: Configuration parameter 'phoenix.queryserver.kerberos.principal' was not found in configurations dictionary!
{code}
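The failing lookup can be reproduced outside Ambari with a minimal stand-in for ConfigDictionary. This is a simplified sketch; the class and the sample data below are hypothetical, not the real resource_management implementation:

```python
# Hypothetical stand-in for Ambari's ConfigDictionary: a dict that raises a
# Fail error (instead of KeyError) when a configuration key is absent.

class Fail(Exception):
    pass

class ConfigDict(dict):
    def __missing__(self, key):
        # Mirrors the message raised in config_dictionary.py
        raise Fail("Configuration parameter '%s' was not found "
                   "in configurations dictionary!" % key)

hbase_site = ConfigDict({
    # Principals the Kerberos Wizard writes for HBase itself; the Phoenix
    # key is absent because Phoenix is not deployed on this cluster.
    'hbase.master.kerberos.principal': 'hbase/_HOST@EXAMPLE.COM',
})

try:
    # Unconditional lookup, as on params_linux.py line 127:
    princ = hbase_site['phoenix.queryserver.kerberos.principal']
except Fail as e:
    print(e)  # prints the same Fail message seen in the stack trace
```

Because params_linux.py performs this lookup at module import time, the error surfaces during `import params` and the whole install command fails.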

stdout:
{code}
2015-04-29 14:05:03,145 - Group['hadoop'] {'ignore_failures': False}
2015-04-29 14:05:03,150 - Modifying group hadoop
2015-04-29 14:05:03,247 - Group['users'] {'ignore_failures': False}
2015-04-29 14:05:03,248 - Modifying group users
2015-04-29 14:05:03,298 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-29 14:05:03,299 - Modifying user hive
2015-04-29 14:05:03,433 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-29 14:05:03,434 - Modifying user zookeeper
2015-04-29 14:05:03,496 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-04-29 14:05:03,497 - Modifying user oozie
2015-04-29 14:05:03,588 - User['ams'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-29 14:05:03,589 - Modifying user ams
2015-04-29 14:05:03,652 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-29 14:05:03,655 - Modifying user falcon
2015-04-29 14:05:03,757 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-04-29 14:05:03,760 - Modifying user tez
2015-04-29 14:05:03,866 - User['mahout'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-29 14:05:03,867 - Modifying user mahout
2015-04-29 14:05:03,939 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-04-29 14:05:03,940 - Modifying user ambari-qa
2015-04-29 14:05:04,049 - User['flume'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-29 14:05:04,054 - Modifying user flume
2015-04-29 14:05:04,186 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-29 14:05:04,187 - Modifying user hdfs
2015-04-29 14:05:04,251 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-29 14:05:04,252 - Modifying user sqoop
2015-04-29 14:05:04,312 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-29 14:05:04,315 - Modifying user yarn
2015-04-29 14:05:04,500 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-29 14:05:04,501 - Modifying user mapred
2015-04-29 14:05:04,633 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-29 14:05:04,635 - Modifying user hbase
2015-04-29 14:05:04,699 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-04-29 14:05:04,700 - Modifying user hcat
2015-04-29 14:05:04,830 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-04-29 14:05:05,137 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2015-04-29 14:05:05,189 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2015-04-29 14:05:05,190 - Directory['/var/opt/teradata/hadoop/hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2015-04-29 14:05:05,889 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-04-29 14:05:06,336 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/var/opt/teradata/hadoop/hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2015-04-29 14:05:06,384 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/var/opt/teradata/hadoop/hbase'] due to not_if
2015-04-29 14:05:06,385 - Group['hdfs'] {'ignore_failures': False}
2015-04-29 14:05:06,386 - Modifying group hdfs
2015-04-29 14:05:06,442 - User['hdfs'] {'ignore_failures': False, 'groups': [u'hadoop', 'hdfs', u'hdfs']}
2015-04-29 14:05:06,443 - Modifying user hdfs
2015-04-29 14:05:06,556 - Directory['/etc/hadoop'] {'mode': 0755}
2015-04-29 14:05:06,715 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'hadoop', 'recursive': True}
2015-04-29 14:05:06,869 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2015-04-29 14:05:06,930 - Skipping Link['/etc/hadoop/conf'] due to not_if
2015-04-29 14:05:06,991 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root', 'group': 'hadoop'}
2015-04-29 14:05:07,290 - Repository['HDP-2.3'] {'base_url': 'http://jolokia1.labs.teradata.com/HDP/suse11sp3/2.x/updates/2.3.0.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP', 'mirror_list': None}
2015-04-29 14:05:07,308 - File['/etc/zypp/repos.d/HDP.repo'] {'content': Template('repo_suse_rhel.j2')}
2015-04-29 14:05:07,665 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://jolokia1.labs.teradata.com/HDP-UTILS-1.1.0.20/repos/suse11sp3', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2015-04-29 14:05:07,672 - File['/etc/zypp/repos.d/HDP-UTILS.repo'] {'content': Template('repo_suse_rhel.j2')}
2015-04-29 14:05:07,932 - Package['unzip'] {}
2015-04-29 14:05:08,851 - Skipping installation of existing package unzip
2015-04-29 14:05:08,852 - Package['curl'] {}
2015-04-29 14:05:09,722 - Skipping installation of existing package curl
2015-04-29 14:05:09,723 - Package['hdp-select'] {}
2015-04-29 14:05:10,559 - Skipping installation of existing package hdp-select
2015-04-29 14:05:11,011 - Package['hbase_2_3_*'] {}
2015-04-29 14:05:12,199 - Skipping installation of existing package hbase_2_3_*
2015-04-29 14:05:12,200 - Package['phoenix_2_3_*'] {}
2015-04-29 14:05:13,259 - Skipping installation of existing package phoenix_2_3_*
2015-04-29 14:05:13,322 - Error while executing command 'install':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/hbase_client.py", line 31, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/hbase_client.py", line 34, in configure
    import params
  File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/params.py", line 26, in <module>
    from params_linux import *
  File "/var/lib/ambari-agent/cache/common-services/HBASE/0.96.0.2.0/package/scripts/params_linux.py", line 127, in <module>
    queryserver_jaas_princ = config['configurations']['hbase-site']['phoenix.queryserver.kerberos.principal'].replace('_HOST',_hostname_lowercase)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/config_dictionary.py", line 79, in __getattr__
    raise Fail("Configuration parameter '" + self.name + "' was not found in configurations dictionary!")
Fail: Configuration parameter 'phoenix.queryserver.kerberos.principal' was not found in configurations dictionary!
{code}

Note: Phoenix is not enabled on this cluster, yet params_linux.py unconditionally reads 'phoenix.queryserver.kerberos.principal', which is why the lookup fails.
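A defensive rewrite of the failing line would derive the principal only when the property is actually present. This is a sketch under the assumption that a missing key should simply leave the value unset; it is not the actual Ambari fix, and the sample values are hypothetical:

```python
# Guarded version of the assignment on params_linux.py line 127:
# fall back to None instead of failing when Phoenix is not deployed.

hostname_lowercase = 'host1.example.com'  # hypothetical agent hostname

hbase_site = {
    'hbase.master.kerberos.principal': 'hbase/_HOST@EXAMPLE.COM',
    # 'phoenix.queryserver.kerberos.principal' intentionally absent
}

raw_princ = hbase_site.get('phoenix.queryserver.kerberos.principal')
queryserver_jaas_princ = (
    raw_princ.replace('_HOST', hostname_lowercase) if raw_princ else None
)
print(queryserver_jaas_princ)  # None when Phoenix is not deployed
```

With this guard, `import params` succeeds on Phoenix-less clusters, and the `_HOST` substitution still happens whenever the wizard has written the property.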

  was: [identical description, without the closing note that Phoenix is not enabled on this cluster]


> HBase Client Install fails after enabling Kerberos
> --------------------------------------------------
>
>                 Key: AMBARI-10849
>                 URL: https://issues.apache.org/jira/browse/AMBARI-10849
>             Project: Ambari
>          Issue Type: Bug
>         Environment: ambari-2.1.0-279, hdp-2.3.0.0-1778, sles11sp3
>            Reporter: Zack Marsh
>
> [description as above]



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)