Posted to dev@ambari.apache.org by "Daniel Horak (JIRA)" <ji...@apache.org> on 2015/01/12 13:11:34 UTC

[jira] [Updated] (AMBARI-9085) Hive Metastore doesn't start if ambari-agent is running with the DEBUG environment variable set

     [ https://issues.apache.org/jira/browse/AMBARI-9085?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Daniel Horak updated AMBARI-9085:
---------------------------------
    Description: 
When ambari-agent is started in a terminal with the environment variable *DEBUG* set to any non-empty value, the *Hive Metastore* service cannot start. (This concerns the ambari-agent running on the same server as Hive Metastore.)

How to reproduce:
1. Stop Hive Metastore from the Ambari web UI.
2. (Re)start ambari-agent on the server hosting Hive Metastore with the DEBUG env variable set:
{noformat}
  DEBUG=1 ambari-agent restart
{noformat}
3. Start Hive Metastore from the Ambari web UI.

Result:
Task "Hive Metastore Start" fails with following output in log:
{noformat}
stderr:   /var/lib/ambari-agent/data/errors-148.txt

Python script has been killed due to timeout

stdout:   /var/lib/ambari-agent/data/output-148.txt

2015-01-12 10:57:55,900 - Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf -x "" --retry 10     http://dhcp-75-204.lab.eng.brq.redhat.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
2015-01-12 10:57:55,922 - Skipping Execute['mkdir -p /tmp/HDP-artifacts/;     curl -kf -x "" --retry 10     http://dhcp-75-204.lab.eng.brq.redhat.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /tmp/HDP-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
2015-01-12 10:57:56,061 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
2015-01-12 10:57:56,063 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2015-01-12 10:57:56,087 - Skipping Link['/etc/hadoop/conf'] due to not_if
2015-01-12 10:57:56,105 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': Template('hadoop-env.sh.j2'), 'owner': 'hdfs'}
2015-01-12 10:57:56,106 - XmlConfig['core-site.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/etc/hadoop/conf', 'configurations': ...}
2015-01-12 10:57:56,112 - Generating config: /etc/hadoop/conf/core-site.xml
2015-01-12 10:57:56,112 - File['/etc/hadoop/conf/core-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None}
2015-01-12 10:57:56,113 - Writing File['/etc/hadoop/conf/core-site.xml'] because contents don't match
2015-01-12 10:57:56,124 - Execute['/bin/echo 0 > /selinux/enforce'] {'only_if': 'test -f /selinux/enforce'}
2015-01-12 10:57:56,146 - Skipping Execute['/bin/echo 0 > /selinux/enforce'] due to only_if
2015-01-12 10:57:56,148 - Execute['mkdir -p /usr/lib/hadoop/lib/native/Linux-i386-32; ln -sf /usr/lib/libsnappy.so /usr/lib/hadoop/lib/native/Linux-i386-32/libsnappy.so'] {}
2015-01-12 10:57:56,178 - Execute['mkdir -p /usr/lib/hadoop/lib/native/Linux-amd64-64; ln -sf /usr/lib64/libsnappy.so /usr/lib/hadoop/lib/native/Linux-amd64-64/libsnappy.so'] {}
2015-01-12 10:57:56,204 - Directory['/var/log/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True}
2015-01-12 10:57:56,205 - Directory['/var/run/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True}
2015-01-12 10:57:56,205 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'recursive': True}
2015-01-12 10:57:56,213 - File['/etc/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2015-01-12 10:57:56,227 - File['/etc/hadoop/conf/health_check'] {'content': Template('health_check-v2.j2'), 'owner': 'hdfs'}
2015-01-12 10:57:56,228 - File['/etc/hadoop/conf/log4j.properties'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2015-01-12 10:57:56,264 - File['/etc/hadoop/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
2015-01-12 10:57:56,265 - File['/etc/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2015-01-12 10:57:56,289 - File['/etc/hadoop/conf/configuration.xsl'] {'owner': 'hdfs', 'group': 'hadoop'}
2015-01-12 10:57:56,476 - Execute['hive mkdir -p /tmp/HDP-artifacts/ ; cp /usr/share/java/mysql-connector-java.jar /usr/lib/hive/lib//mysql-connector-java.jar'] {'creates': '/usr/lib/hive/lib//mysql-connector-java.jar', 'path': ['/bin', '/usr/bin/'], 'not_if': 'test -f /usr/lib/hive/lib//mysql-connector-java.jar'}
2015-01-12 10:57:56,500 - Skipping Execute['hive mkdir -p /tmp/HDP-artifacts/ ; cp /usr/share/java/mysql-connector-java.jar /usr/lib/hive/lib//mysql-connector-java.jar'] due to not_if
2015-01-12 10:57:56,500 - Directory['/etc/hive/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'recursive': True}
2015-01-12 10:57:56,502 - XmlConfig['mapred-site.xml'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600, 'conf_dir': '/etc/hive/conf.server', 'configurations': ...}
2015-01-12 10:57:56,511 - Generating config: /etc/hive/conf.server/mapred-site.xml
2015-01-12 10:57:56,511 - File['/etc/hive/conf.server/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600}
2015-01-12 10:57:56,513 - Writing File['/etc/hive/conf.server/mapred-site.xml'] because contents don't match
2015-01-12 10:57:56,513 - XmlConfig['hive-site.xml'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0600, 'conf_dir': '/etc/hive/conf.server', 'configurations': ...}
2015-01-12 10:57:56,518 - Generating config: /etc/hive/conf.server/hive-site.xml
2015-01-12 10:57:56,518 - File['/etc/hive/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0600}
2015-01-12 10:57:56,520 - Writing File['/etc/hive/conf.server/hive-site.xml'] because contents don't match
2015-01-12 10:57:56,521 - Execute['/bin/sh -c 'cd /usr/lib/ambari-agent/ && curl -kf -x "" --retry 5 http://dhcp-75-204.lab.eng.brq.redhat.com:8080/resources/DBConnectionVerification.jar -o DBConnectionVerification.jar''] {'environment': ..., 'not_if': '[ -f DBConnectionVerification.jar]'}
2015-01-12 10:57:56,634 - File['/etc/hive/conf.server/hive-env.sh'] {'content': Template('hive-env.sh.j2'), 'owner': 'hive', 'group': 'hadoop'}
2015-01-12 10:57:56,645 - File['/tmp/start_metastore_script'] {'content': StaticFile('startMetastore.sh'), 'mode': 0755}
2015-01-12 10:57:56,655 - Execute['export HIVE_CONF_DIR=/etc/hive/conf.server ; /usr/lib/hive/bin/schematool -initSchema -dbType mysql -userName hive -passWord [PROTECTED]'] {'not_if': 'export HIVE_CONF_DIR=/etc/hive/conf.server ; /usr/lib/hive/bin/schematool -info -dbType mysql -userName hive -passWord [PROTECTED]'}
{noformat}

With the DEBUG variable unset (or empty), everything works as expected and Hive Metastore starts properly:
{noformat}
DEBUG= ambari-agent restart
{noformat}
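
To double-check that it is really the inherited environment that makes the difference, the running agent's environment can be inspected directly. A minimal sketch, assuming the default pid file location /var/run/ambari-agent/ambari-agent.pid (adjust the path if your install differs):
{noformat}
# Read the agent's pid from its pid file (default location; may differ per install).
AGENT_PID=$(cat /var/run/ambari-agent/ambari-agent.pid)

# /proc/<pid>/environ is NUL-separated; print it one variable per line and look for DEBUG.
# Every command the agent spawns (hive, schematool, ...) inherits this environment.
tr '\0' '\n' < /proc/"$AGENT_PID"/environ | grep '^DEBUG='
{noformat}
If DEBUG shows up there, restarting the agent from a shell without it (for example "env -u DEBUG ambari-agent restart", or the empty assignment above) avoids the problem.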

*UPDATE:*
The root of the issue seems to be directly in Hive, because when I run the following DB schema check with the DEBUG env variable set, the command freezes. (This is the same schematool -info command the failing task runs as its not_if check, which would explain the agent-side timeout.)
{noformat}
export HIVE_CONF_DIR=/etc/hive/conf.server
DEBUG=1 /usr/lib/hive/bin/schematool -info -dbType mysql -userName hive -passWord $PASSWORD
{noformat}
So if you think this is entirely a Hive issue (or expected behaviour?) and there is nothing to do from the Ambari point of view, feel free to close this JIRA or move it to the Hive project.
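
For what it's worth, my guess (not verified) is that Hive's launcher treats a non-empty DEBUG as a request for remote JVM debugging and starts the JVM with a JDWP agent that suspends until a debugger attaches, which would look exactly like this freeze. A diagnostic sketch along those lines (the port and option names are illustrative, not taken from this report):
{noformat}
# Run the same schema check in the background with DEBUG set...
export HIVE_CONF_DIR=/etc/hive/conf.server
DEBUG=1 /usr/lib/hive/bin/schematool -info -dbType mysql -userName hive -passWord "$PASSWORD" &
sleep 15

# ...then look for a java process started with a jdwp agent and for a listening
# debug socket (port 8000 is a common default when suspend=y is used).
ps -ef | grep '[j]dwp'
netstat -ltnp 2>/dev/null | grep java
{noformat}
If an -agentlib:jdwp=...,suspend=y,... argument shows up on the java command line, the "hang" is just the JVM waiting for a debugger, and keeping DEBUG out of the agent's environment is the workaround.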


  was: (same as the current description above, without the *UPDATE* section)


> Hive Metastore doesn't start if ambari-agent is running with the DEBUG environment variable set
> -----------------------------------------------------------------------------------------------
>
>                 Key: AMBARI-9085
>                 URL: https://issues.apache.org/jira/browse/AMBARI-9085
>             Project: Ambari
>          Issue Type: Bug
>    Affects Versions: 1.6.1
>         Environment: # cat /etc/redhat-release 
> Red Hat Enterprise Linux Server release 6.6 (Santiago)
>            Reporter: Daniel Horak
>            Priority: Minor
>
> (full issue description as above)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)