Posted to dev@ambari.apache.org by "Jonathan Hurley (JIRA)" <ji...@apache.org> on 2014/06/30 21:15:25 UTC

[jira] [Created] (AMBARI-6320) Nagios failed to start on stack upgraded cluster

Jonathan Hurley created AMBARI-6320:
---------------------------------------

             Summary: Nagios failed to start on stack upgraded cluster
                 Key: AMBARI-6320
                 URL: https://issues.apache.org/jira/browse/AMBARI-6320
             Project: Ambari
          Issue Type: Bug
    Affects Versions: 1.6.1
         Environment: CentOS 6.4
            Reporter: Jonathan Hurley
            Assignee: Jonathan Hurley
            Priority: Blocker
             Fix For: 1.6.1


{code}
stderr:   /var/lib/ambari-agent/data/errors-356.txt

2014-06-27 13:00:29,322 - Error while executing command 'start':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 111, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/NAGIOS/package/scripts/nagios_server.py", line 47, in start
    self.configure(env) # done for updating configs after Security enabled
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/NAGIOS/package/scripts/nagios_server.py", line 38, in configure
    nagios()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/NAGIOS/package/scripts/nagios.py", line 62, in nagios
    nagios_server_config()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/NAGIOS/package/scripts/nagios_server_config.py", line 39, in nagios_server_config
    nagios_server_configfile( 'hadoop-services.cfg')
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/NAGIOS/package/scripts/nagios_server_config.py", line 87, in nagios_server_configfile
    mode           = mode
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/template_config.py", line 42, in action_create
    content = Template(template_name, extra_imports=self.resource.extra_imports)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 96, in action_create
    content = self._get_content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 136, in _get_content
    return content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 47, in __call__
    return self.get_content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 126, in get_content
    rendered = self.template.render(self.context)
  File "/usr/lib/python2.6/site-packages/jinja2/environment.py", line 891, in render
    return self.environment.handle_exception(exc_info, True)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/NAGIOS/package/templates/hadoop-services.cfg.j2", line 424, in top-level template code
    check_command           check_hdfs_capacity!$HOSTGROUPMEMBERS:namenode$!{{ namenode_port }}!80%!90%!{{ str(hadoop_ssl_enabled).lower() }}!{{ nagios_keytab_path }}!{{ nagios_principal_name }}!{{ kinit_path_local }}!{{ str(security_enabled).lower() }}
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/config_dictionary.py", line 75, in __getattr__
    raise Fail("Configuration parameter '"+self.name+"' was not found in configurations dictionary!")
Fail: Configuration parameter 'dfs.namenode.checkpoint.txns' was not found in configurations dictionary!
stdout:   /var/lib/ambari-agent/data/output-356.txt

2014-06-27 13:00:28,614 - Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf --retry 10     http://amb-upg160-6-4postgres1403872559-2.cs1cloud.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
2014-06-27 13:00:28,626 - Skipping Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf --retry 10     http://amb-upg160-6-4postgres1403872559-2.cs1cloud.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
2014-06-27 13:00:28,745 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
2014-06-27 13:00:28,747 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2014-06-27 13:00:28,759 - Skipping Link['/etc/hadoop/conf'] due to not_if
2014-06-27 13:00:28,775 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': Template('hadoop-env.sh.j2'), 'owner': 'hdfs'}
2014-06-27 13:00:28,776 - XmlConfig['core-site.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/etc/hadoop/conf', 'configurations': ...}
2014-06-27 13:00:28,782 - Generating config: /etc/hadoop/conf/core-site.xml
2014-06-27 13:00:28,783 - File['/etc/hadoop/conf/core-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None}
2014-06-27 13:00:28,783 - Writing File['/etc/hadoop/conf/core-site.xml'] because contents don't match
2014-06-27 13:00:28,797 - Execute['/bin/echo 0 > /selinux/enforce'] {'only_if': 'test -f /selinux/enforce'}
2014-06-27 13:00:28,809 - Skipping Execute['/bin/echo 0 > /selinux/enforce'] due to only_if
2014-06-27 13:00:28,811 - Execute['mkdir -p /usr/lib/hadoop/lib/native/Linux-i386-32; ln -sf /usr/lib/libsnappy.so /usr/lib/hadoop/lib/native/Linux-i386-32/libsnappy.so'] {}
2014-06-27 13:00:28,825 - Execute['mkdir -p /usr/lib/hadoop/lib/native/Linux-amd64-64; ln -sf /usr/lib64/libsnappy.so /usr/lib/hadoop/lib/native/Linux-amd64-64/libsnappy.so'] {}
2014-06-27 13:00:28,838 - Directory['/grid/0/log/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True}
2014-06-27 13:00:28,839 - Directory['/var/run/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True}
2014-06-27 13:00:28,839 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'recursive': True}
2014-06-27 13:00:28,845 - File['/etc/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2014-06-27 13:00:28,847 - File['/etc/hadoop/conf/health_check'] {'content': Template('health_check-v2.j2'), 'owner': 'hdfs'}
2014-06-27 13:00:28,848 - File['/etc/hadoop/conf/log4j.properties'] {'content': '...', 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2014-06-27 13:00:28,853 - File['/etc/hadoop/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
2014-06-27 13:00:28,854 - File['/etc/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2014-06-27 13:00:28,854 - File['/etc/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2014-06-27 13:00:29,112 - File['/etc/httpd/conf.d/nagios.conf'] {'content': Template('nagios.conf.j2'), 'owner': 'nagios', 'group': 'nagios', 'mode': 0644}
2014-06-27 13:00:29,113 - Writing File['/etc/httpd/conf.d/nagios.conf'] because contents don't match
2014-06-27 13:00:29,114 - Directory['/etc/nagios'] {'owner': 'nagios', 'group': 'nagios'}
2014-06-27 13:00:29,114 - Directory['/usr/lib64/nagios/plugins'] {}
2014-06-27 13:00:29,115 - Directory['/etc/nagios/objects'] {}
2014-06-27 13:00:29,115 - Directory['/var/run/nagios'] {'owner': 'nagios', 'group': 'nagios', 'recursive': True, 'mode': 0755}
2014-06-27 13:00:29,115 - Directory['/var/nagios'] {'owner': 'nagios', 'group': 'nagios', 'recursive': True}
2014-06-27 13:00:29,116 - Directory['/var/nagios/spool/checkresults'] {'owner': 'nagios', 'group': 'nagios', 'recursive': True}
2014-06-27 13:00:29,116 - Directory['/var/nagios/rw'] {'owner': 'nagios', 'group': 'nagios', 'recursive': True}
2014-06-27 13:00:29,117 - Directory['/var/log/nagios'] {'owner': 'nagios', 'group': 'nagios', 'mode': 0755}
2014-06-27 13:00:29,117 - Directory['/var/log/nagios/archives'] {'owner': 'nagios', 'group': 'nagios', 'mode': 0755}
2014-06-27 13:00:29,118 - TemplateConfig['/etc/nagios/nagios.cfg'] {'owner': 'nagios', 'group': 'nagios', 'mode': None}
2014-06-27 13:00:29,147 - File['/etc/nagios/nagios.cfg'] {'content': Template('nagios.cfg.j2'), 'owner': 'nagios', 'group': 'nagios', 'mode': None}
2014-06-27 13:00:29,148 - Writing File['/etc/nagios/nagios.cfg'] because contents don't match
2014-06-27 13:00:29,149 - TemplateConfig['/etc/nagios/resource.cfg'] {'owner': 'nagios', 'group': 'nagios', 'mode': None}
2014-06-27 13:00:29,151 - File['/etc/nagios/resource.cfg'] {'content': Template('resource.cfg.j2'), 'owner': 'nagios', 'group': 'nagios', 'mode': None}
2014-06-27 13:00:29,152 - Writing File['/etc/nagios/resource.cfg'] because contents don't match
2014-06-27 13:00:29,153 - TemplateConfig['/etc/nagios/objects/hadoop-hosts.cfg'] {'owner': 'nagios', 'group': 'hadoop', 'mode': None}
2014-06-27 13:00:29,158 - File['/etc/nagios/objects/hadoop-hosts.cfg'] {'content': Template('hadoop-hosts.cfg.j2'), 'owner': 'nagios', 'group': 'hadoop', 'mode': None}
2014-06-27 13:00:29,158 - Writing File['/etc/nagios/objects/hadoop-hosts.cfg'] because contents don't match
2014-06-27 13:00:29,159 - TemplateConfig['/etc/nagios/objects/hadoop-hostgroups.cfg'] {'owner': 'nagios', 'group': 'hadoop', 'mode': None}
2014-06-27 13:00:29,166 - File['/etc/nagios/objects/hadoop-hostgroups.cfg'] {'content': Template('hadoop-hostgroups.cfg.j2'), 'owner': 'nagios', 'group': 'hadoop', 'mode': None}
2014-06-27 13:00:29,166 - Writing File['/etc/nagios/objects/hadoop-hostgroups.cfg'] because contents don't match
2014-06-27 13:00:29,167 - TemplateConfig['/etc/nagios/objects/hadoop-servicegroups.cfg'] {'owner': 'nagios', 'group': 'hadoop', 'mode': None}
2014-06-27 13:00:29,187 - File['/etc/nagios/objects/hadoop-servicegroups.cfg'] {'content': Template('hadoop-servicegroups.cfg.j2'), 'owner': 'nagios', 'group': 'hadoop', 'mode': None}
2014-06-27 13:00:29,188 - Writing File['/etc/nagios/objects/hadoop-servicegroups.cfg'] because contents don't match
2014-06-27 13:00:29,189 - TemplateConfig['/etc/nagios/objects/hadoop-services.cfg'] {'owner': 'nagios', 'group': 'hadoop', 'mode': None}
2014-06-27 13:00:29,315 - File['/etc/nagios/objects/hadoop-services.cfg'] {'content': Template('hadoop-services.cfg.j2'), 'owner': 'nagios', 'group': 'hadoop', 'mode': None}
2014-06-27 13:00:29,322 - Error while executing command 'start':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 111, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/NAGIOS/package/scripts/nagios_server.py", line 47, in start
    self.configure(env) # done for updating configs after Security enabled
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/NAGIOS/package/scripts/nagios_server.py", line 38, in configure
    nagios()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/NAGIOS/package/scripts/nagios.py", line 62, in nagios
    nagios_server_config()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/NAGIOS/package/scripts/nagios_server_config.py", line 39, in nagios_server_config
    nagios_server_configfile( 'hadoop-services.cfg')
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/NAGIOS/package/scripts/nagios_server_config.py", line 87, in nagios_server_configfile
    mode           = mode
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/template_config.py", line 42, in action_create
    content = Template(template_name, extra_imports=self.resource.extra_imports)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 96, in action_create
    content = self._get_content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 136, in _get_content
    return content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 47, in __call__
    return self.get_content()
  File "/usr/lib/python2.6/site-packages/resource_management/core/source.py", line 126, in get_content
    rendered = self.template.render(self.context)
  File "/usr/lib/python2.6/site-packages/jinja2/environment.py", line 891, in render
    return self.environment.handle_exception(exc_info, True)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/NAGIOS/package/templates/hadoop-services.cfg.j2", line 424, in top-level template code
    check_command           check_hdfs_capacity!$HOSTGROUPMEMBERS:namenode$!{{ namenode_port }}!80%!90%!{{ str(hadoop_ssl_enabled).lower() }}!{{ nagios_keytab_path }}!{{ nagios_principal_name }}!{{ kinit_path_local }}!{{ str(security_enabled).lower() }}
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/config_dictionary.py", line 75, in __getattr__
    raise Fail("Configuration parameter '"+self.name+"' was not found in configurations dictionary!")
Fail: Configuration parameter 'dfs.namenode.checkpoint.txns' was not found in configurations dictionary!
{code}
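The trace pinpoints the root cause: after the stack upgrade, hdfs-site no longer carries the `dfs.namenode.checkpoint.txns` property, and Ambari's config dictionary raises `Fail` for any parameter the `hadoop-services.cfg.j2` template references but cannot find. Below is a minimal Python sketch of that failure mode; the class and values are simplified stand-ins for illustration, not Ambari's actual `config_dictionary` implementation.

```python
# Simplified illustration (hypothetical names) of the failure mode:
# a config wrapper that raises Fail when a referenced parameter is
# missing from the configurations dictionary after a stack upgrade.

class Fail(Exception):
    pass

class ConfigDictionary(dict):
    """Dict whose failed attribute lookups raise Fail, mirroring the
    error message seen in the traceback above."""
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise Fail("Configuration parameter '" + name +
                       "' was not found in configurations dictionary!")

# An upgraded cluster may lack properties introduced by the newer stack,
# e.g. 'dfs.namenode.checkpoint.txns' (values here are illustrative):
hdfs_site = ConfigDictionary({'dfs.namenode.checkpoint.period': '21600'})

try:
    # Equivalent to the template dereferencing the missing parameter.
    getattr(hdfs_site, 'dfs.namenode.checkpoint.txns')
except Fail as e:
    print(e)

# A defensive lookup with an explicit fallback avoids the hard failure
# when the property is absent:
checkpoint_txns = hdfs_site.get('dfs.namenode.checkpoint.txns', '1000000')
print(checkpoint_txns)
```

Resolving template parameters with an explicit default when the property may be absent (as in the sketch's last lines) sidesteps the hard failure on upgraded clusters; where exactly that default belongs in the Nagios scripts is the subject of this fix.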



--
This message was sent by Atlassian JIRA
(v6.2#6252)