Posted to issues@ambari.apache.org by "Jonathan Hurley (JIRA)" <ji...@apache.org> on 2016/07/05 00:58:11 UTC

[jira] [Created] (AMBARI-17557) Hive WebHCat Service Check Fails During Upgrade Due To Missing Configuration Files

Jonathan Hurley created AMBARI-17557:
----------------------------------------

             Summary: Hive WebHCat Service Check Fails During Upgrade Due To Missing Configuration Files
                 Key: AMBARI-17557
                 URL: https://issues.apache.org/jira/browse/AMBARI-17557
             Project: Ambari
          Issue Type: Bug
          Components: ambari-server
    Affects Versions: 2.4.0
            Reporter: Jonathan Hurley
            Assignee: Jonathan Hurley
            Priority: Blocker
             Fix For: 2.4.0


During an upgrade from HDP 2.2 to HDP 2.4, the Hive WebHCat Service Check fails several times:

{code}
2016-06-29 02:24:49,660 - Retrying after 5 seconds. Reason: Execution of '/var/lib/ambari-agent/tmp/templetonSmoke.sh natr66-cuts-c10to25rhel6unsr-10.openstacklocal ambari-qa 50111 idtest.ambari-qa.1467167063.06.pig no_keytab false kinit no_principal /var/lib/ambari-agent/tmp' returned 1. Templeton Smoke Test (pig cmd): Failed. : {\"error\":\"Failed to get WebHCatShim\"}http_code <500>
Templeton Smoke Test (pig cmd): Failed. : {\"error\":\"Failed to get WebHCatShim\"}http_code <500>
{code}

Caused by:
{code}
FATAL | 29 Jun 2016 02:24:36,920 | org.apache.hadoop.conf.Configuration | error parsing conf file:/etc/hive-webhcat/conf/webhcat-site.xml
java.io.FileNotFoundException: /etc/hive-webhcat/conf/webhcat-site.xml (No such file or directory)
	at java.io.FileInputStream.open(Native Method)
	at java.io.FileInputStream.<init>(FileInputStream.java:146)
	at java.io.FileInputStream.<init>(FileInputStream.java:101)
	at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
	at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
	at java.net.URL.openStream(URL.java:1037)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2342)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2410)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2376)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2283)
	at org.apache.hadoop.conf.Configuration.get(Configuration.java:888)
	at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:907)
	at org.apache.hadoop.conf.Configuration.getLong(Configuration.java:1182)
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:139)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapred.ResourceMgrDelegate.serviceInit(ResourceMgrDelegate.java:102)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapred.ResourceMgrDelegate.<init>(ResourceMgrDelegate.java:96)
	at org.apache.hadoop.mapred.YARNRunner.<init>(YARNRunner.java:112)
	at org.apache.hadoop.mapred.YarnClientProtocolProvider.create(YarnClientProtocolProvider.java:34)
	at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:95)
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
	at org.apache.hadoop.mapred.JobClient.init(JobClient.java:485)
	at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:475)
	at org.apache.hadoop.mapred.WebHCatJTShim23$1.run(WebHCatJTShim23.java:60)
	at org.apache.hadoop.mapred.WebHCatJTShim23$1.run(WebHCatJTShim23.java:57)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
	at org.apache.hadoop.mapred.WebHCatJTShim23.<init>(WebHCatJTShim23.java:57)
	at org.apache.hadoop.hive.shims.Hadoop23Shims.getWebHCatShim(Hadoop23Shims.java:500)
{code}
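Before rerunning the service check, it helps to confirm whether the conf symlink actually resolves. A minimal sketch (the {{check_conf}} helper is hypothetical; the path follows the HDP layout shown below):

```shell
# Hypothetical helper: report whether a conf path is a symlink and whether
# its target exists.
check_conf() {
  conf="$1"
  if [ -L "$conf" ]; then
    target=$(readlink "$conf")
    if [ -e "$conf" ]; then
      echo "ok: $conf -> $target"
    else
      echo "broken: $conf -> $target (target missing)"
    fi
  else
    echo "not a symlink: $conf"
  fi
}

# On an affected HDP node this reports the dangling link:
check_conf /etc/hive-webhcat/conf
```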

The file is missing because the symlink is created incorrectly when the HDP 2.2 hard {{/etc/component/conf}} directories are converted to symlinks: {{/etc/hive-webhcat/conf}} ends up pointing at the {{hive-hcatalog}} backup directory instead of its own {{conf.backup}}:

{code:title=Wrong}
[root@c6402 hive-webhcat]# ll /etc/hive-webhcat/
total 8
drwxr-xr-x 3 root root   4096 Jul  5 00:49 2.4.2.0-236
lrwxrwxrwx 1 root root     29 Jul  5 00:49 conf -> /etc/hive-hcatalog/conf.backup
drwxr-xr-x 2 hcat hadoop 4096 Jul  4 23:25 conf.backup
{code}

{code:title=Right}
[root@c6402 hive-webhcat]# ll /etc/hive-webhcat/
total 8
drwxr-xr-x 3 root root   4096 Jul  5 00:49 2.4.2.0-236
lrwxrwxrwx 1 root root     29 Jul  5 00:49 conf -> /etc/hive-webhcat/conf.backup
drwxr-xr-x 2 hcat hadoop 4096 Jul  4 23:25 conf.backup
{code}
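The wrong-vs-right layouts above can be reproduced and repaired with plain {{ln}}. A minimal sketch, simulated under a temp directory so it is safe to run anywhere (on a real node the paths live under {{/etc}}):

```shell
# Simulate the broken layout from the report under a scratch directory.
root=$(mktemp -d)
mkdir -p "$root/hive-webhcat/conf.backup"

# Broken state: conf points at hive-hcatalog's backup, which does not exist.
ln -s "$root/hive-hcatalog/conf.backup" "$root/hive-webhcat/conf"
[ -e "$root/hive-webhcat/conf" ] || echo "broken: dangling conf symlink"

# Fix: repoint conf at the component's own conf.backup directory.
# -sfn replaces the existing symlink atomically without following it.
ln -sfn "$root/hive-webhcat/conf.backup" "$root/hive-webhcat/conf"
[ -e "$root/hive-webhcat/conf" ] && echo "fixed: conf resolves"

rm -rf "$root"
```

This is the same repair Ambari needs to apply during the conf-directory conversion, i.e. each component's {{conf}} link must target that component's own backup.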



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)