Posted to user@ambari.apache.org by Shaik M <mu...@gmail.com> on 2015/05/04 12:55:29 UTC

Ambari Unable to start Hive server 2 after enabling security

Hi,

I am using Ambari 1.7 and HDP 2.2.4. I have enabled security in this
cluster.

After enabling security, Ambari is unable to start HiveServer2.

I have verified the keytabs and they are working fine. Please find the
Ambari process log below; no output is shown in the stderr window.

Please let me know how to resolve this issue.

Thanks,
Shaik M

stdout:   /var/lib/ambari-agent/data/output-1832.txt

2015-05-04 10:47:25,616 - Execute['mkdir -p
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x ""
--retry 10     http://sv2lxbdp2mst05.corp.equinix.com:8080/resources//UnlimitedJCEPolicyJDK7.zip
-o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip']
{'environment': ..., 'not_if': 'test -e
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip',
'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
2015-05-04 10:47:25,646 - Skipping Execute['mkdir -p
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x ""
--retry 10     http://sv2lxbdp2mst05.corp.equinix.com:8080/resources//UnlimitedJCEPolicyJDK7.zip
-o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip']
due to not_if
2015-05-04 10:47:25,648 - Execute['rm -f local_policy.jar; rm -f
US_export_policy.jar; unzip -o -j -q
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip']
{'path': ['/bin/', '/usr/bin'], 'only_if': 'test -e
/usr/jdk64/jdk1.7.0_67/jre/lib/security && test -f
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip',
'cwd': '/usr/jdk64/jdk1.7.0_67/jre/lib/security'}
2015-05-04 10:47:25,718 - Group['hadoop'] {'ignore_failures': False}
2015-05-04 10:47:25,720 - Modifying group hadoop
2015-05-04 10:47:25,784 - Group['nobody'] {'ignore_failures': False}
2015-05-04 10:47:25,785 - Modifying group nobody
2015-05-04 10:47:25,839 - Group['users'] {'ignore_failures': False}
2015-05-04 10:47:25,840 - Modifying group users
2015-05-04 10:47:25,886 - Group['nagios'] {'ignore_failures': False}
2015-05-04 10:47:25,887 - Modifying group nagios
2015-05-04 10:47:25,936 - User['nobody'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'nobody']}
2015-05-04 10:47:25,937 - Modifying user nobody
2015-05-04 10:47:26,012 - User['oozie'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'users']}
2015-05-04 10:47:26,013 - Modifying user oozie
2015-05-04 10:47:26,063 - User['hive'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,064 - Modifying user hive
2015-05-04 10:47:26,098 - User['mapred'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,099 - Modifying user mapred
2015-05-04 10:47:26,131 - User['nagios'] {'gid': 'nagios',
'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,132 - Modifying user nagios
2015-05-04 10:47:26,165 - User['ambari-qa'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'users']}
2015-05-04 10:47:26,166 - Modifying user ambari-qa
2015-05-04 10:47:26,199 - User['zookeeper'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,199 - Modifying user zookeeper
2015-05-04 10:47:26,232 - User['tez'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'users']}
2015-05-04 10:47:26,233 - Modifying user tez
2015-05-04 10:47:26,265 - User['hdfs'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,266 - Modifying user hdfs
2015-05-04 10:47:26,298 - User['sqoop'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,299 - Modifying user sqoop
2015-05-04 10:47:26,332 - User['hcat'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,332 - Modifying user hcat
2015-05-04 10:47:26,365 - User['yarn'] {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,366 - Modifying user yarn
2015-05-04 10:47:26,399 -
File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content':
StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-05-04 10:47:26,401 -
Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa
/tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
2015-05-04 10:47:26,431 - Skipping
Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa
/tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa
2>/dev/null'] due to not_if
2015-05-04 10:47:26,432 - Directory['/etc/hadoop/conf.empty']
{'owner': 'root', 'group': 'root', 'recursive': True}
2015-05-04 10:47:26,433 - Link['/etc/hadoop/conf'] {'not_if': 'ls
/etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2015-05-04 10:47:26,462 - Skipping Link['/etc/hadoop/conf'] due to not_if
2015-05-04 10:47:26,491 - File['/etc/hadoop/conf/hadoop-env.sh']
{'content': InlineTemplate(...), 'owner': 'root'}
2015-05-04 10:47:26,516 - Execute['/bin/echo 0 > /selinux/enforce']
{'only_if': 'test -f /selinux/enforce'}
2015-05-04 10:47:26,577 - Directory['/var/log/hadoop'] {'owner':
'root', 'group': 'hadoop', 'mode': 0775, 'recursive': True}
2015-05-04 10:47:26,578 - Directory['/var/run/hadoop'] {'owner':
'root', 'group': 'root', 'recursive': True}
2015-05-04 10:47:26,579 - Directory['/tmp/hadoop-hdfs'] {'owner':
'hdfs', 'recursive': True}
2015-05-04 10:47:26,590 -
File['/etc/hadoop/conf/commons-logging.properties'] {'content':
Template('commons-logging.properties.j2'), 'owner': 'root'}
2015-05-04 10:47:26,595 - File['/etc/hadoop/conf/health_check']
{'content': Template('health_check-v2.j2'), 'owner': 'root'}
2015-05-04 10:47:26,596 - File['/etc/hadoop/conf/log4j.properties']
{'content': '...', 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2015-05-04 10:47:26,608 -
File['/etc/hadoop/conf/hadoop-metrics2.properties'] {'content':
Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
2015-05-04 10:47:26,609 -
File['/etc/hadoop/conf/task-log4j.properties'] {'content':
StaticFile('task-log4j.properties'), 'mode': 0755}
2015-05-04 10:47:26,956 - Execute['kill `cat
/var/run/hive/hive-server.pid` >/dev/null 2>&1 && rm -f
/var/run/hive/hive-server.pid'] {'not_if': '! (ls
/var/run/hive/hive-server.pid >/dev/null 2>&1 && ps `cat
/var/run/hive/hive-server.pid` >/dev/null 2>&1)'}
2015-05-04 10:47:27,068 - HdfsDirectory['/apps/hive/warehouse']
{'security_enabled': True, 'keytab':
'/etc/security/keytabs/hdfs.headless.keytab', 'conf_dir':
'/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local':
'/usr/bin/kinit', 'mode': 0777, 'owner': 'hive', 'bin_dir':
'/usr/hdp/current/hadoop-client/bin', 'action': ['create_delayed']}
2015-05-04 10:47:27,069 - HdfsDirectory['/user/hive']
{'security_enabled': True, 'keytab':
'/etc/security/keytabs/hdfs.headless.keytab', 'conf_dir':
'/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local':
'/usr/bin/kinit', 'mode': 0700, 'owner': 'hive', 'bin_dir':
'/usr/hdp/current/hadoop-client/bin', 'action': ['create_delayed']}
2015-05-04 10:47:27,070 - HdfsDirectory['None'] {'security_enabled':
True, 'keytab': '/etc/security/keytabs/hdfs.headless.keytab',
'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs',
'kinit_path_local': '/usr/bin/kinit', 'action': ['create'], 'bin_dir':
'/usr/hdp/current/hadoop-client/bin'}
2015-05-04 10:47:27,073 - Execute['/usr/bin/kinit -kt
/etc/security/keytabs/hdfs.headless.keytab hdfs'] {'user': 'hdfs'}
2015-05-04 10:47:27,850 - Execute['hadoop --config /etc/hadoop/conf fs
-mkdir `rpm -q hadoop | grep -q "hadoop-1" || echo "-p"`
/apps/hive/warehouse /user/hive && hadoop --config /etc/hadoop/conf fs
-chmod  777 /apps/hive/warehouse && hadoop --config /etc/hadoop/conf
fs -chmod  700 /user/hive && hadoop --config /etc/hadoop/conf fs
-chown  hive /apps/hive/warehouse /user/hive'] {'not_if': "su - hdfs
-c 'export PATH=$PATH:/usr/hdp/current/hadoop-client/bin ; hadoop
--config /etc/hadoop/conf fs -ls /apps/hive/warehouse /user/hive'",
'user': 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
2015-05-04 10:47:31,133 - Skipping Execute['hadoop --config
/etc/hadoop/conf fs -mkdir `rpm -q hadoop | grep -q "hadoop-1" || echo
"-p"` /apps/hive/warehouse /user/hive && hadoop --config
/etc/hadoop/conf fs -chmod  777 /apps/hive/warehouse && hadoop
--config /etc/hadoop/conf fs -chmod  700 /user/hive && hadoop --config
/etc/hadoop/conf fs -chown  hive /apps/hive/warehouse /user/hive'] due
to not_if
2015-05-04 10:47:31,134 - Directory['/etc/hive/conf.server'] {'owner':
'hive', 'group': 'hadoop', 'recursive': True}
2015-05-04 10:47:31,135 - XmlConfig['mapred-site.xml'] {'group':
'hadoop', 'conf_dir': '/etc/hive/conf.server', 'mode': 0644,
'configuration_attributes': ..., 'owner': 'hive', 'configurations':
...}
2015-05-04 10:47:31,168 - Generating config:
/etc/hive/conf.server/mapred-site.xml
2015-05-04 10:47:31,169 -
File['/etc/hive/conf.server/mapred-site.xml'] {'owner': 'hive',
'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644,
'encoding': 'UTF-8'}
2015-05-04 10:47:31,173 - Writing
File['/etc/hive/conf.server/mapred-site.xml'] because contents don't
match
2015-05-04 10:47:31,174 -
File['/etc/hive/conf.server/hive-default.xml.template'] {'owner':
'hive', 'group': 'hadoop'}
2015-05-04 10:47:31,175 -
File['/etc/hive/conf.server/hive-env.sh.template'] {'owner': 'hive',
'group': 'hadoop'}
2015-05-04 10:47:31,177 -
File['/etc/hive/conf.server/hive-exec-log4j.properties'] {'content':
'...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2015-05-04 10:47:31,178 -
File['/etc/hive/conf.server/hive-log4j.properties'] {'content': '...',
'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2015-05-04 10:47:31,179 - Directory['/etc/hive/conf'] {'owner':
'hive', 'group': 'hadoop', 'recursive': True}
2015-05-04 10:47:31,180 - XmlConfig['mapred-site.xml'] {'group':
'hadoop', 'conf_dir': '/etc/hive/conf', 'mode': 0644,
'configuration_attributes': ..., 'owner': 'hive', 'configurations':
...}
2015-05-04 10:47:31,202 - Generating config: /etc/hive/conf/mapred-site.xml
2015-05-04 10:47:31,203 - File['/etc/hive/conf/mapred-site.xml']
{'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop',
'mode': 0644, 'encoding': 'UTF-8'}
2015-05-04 10:47:31,207 - Writing
File['/etc/hive/conf/mapred-site.xml'] because contents don't match
2015-05-04 10:47:31,208 -
File['/etc/hive/conf/hive-default.xml.template'] {'owner': 'hive',
'group': 'hadoop'}
2015-05-04 10:47:31,209 - File['/etc/hive/conf/hive-env.sh.template']
{'owner': 'hive', 'group': 'hadoop'}
2015-05-04 10:47:31,211 -
File['/etc/hive/conf/hive-exec-log4j.properties'] {'content': '...',
'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2015-05-04 10:47:31,212 - File['/etc/hive/conf/hive-log4j.properties']
{'content': '...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2015-05-04 10:47:31,213 - XmlConfig['hive-site.xml'] {'group':
'hadoop', 'conf_dir': '/etc/hive/conf.server', 'mode': 0644,
'configuration_attributes': ..., 'owner': 'hive', 'configurations':
...}
2015-05-04 10:47:31,236 - Generating config: /etc/hive/conf.server/hive-site.xml
2015-05-04 10:47:31,237 - File['/etc/hive/conf.server/hive-site.xml']
{'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop',
'mode': 0644, 'encoding': 'UTF-8'}
2015-05-04 10:47:31,245 - Writing
File['/etc/hive/conf.server/hive-site.xml'] because contents don't
match
2015-05-04 10:47:31,251 - File['/etc/hive/conf.server/hive-env.sh']
{'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop'}
2015-05-04 10:47:31,253 - Execute['hive mkdir -p
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; rm -f
/usr/hdp/current/hive-client/lib/mysql-connector-java.jar ; cp
/usr/share/java/mysql-connector-java.jar
/usr/hdp/current/hive-client/lib/mysql-connector-java.jar']
{'environment': ..., 'path': ['/bin', '/usr/bin/'], 'creates':
'/usr/hdp/current/hive-client/lib/mysql-connector-java.jar', 'not_if':
'test -f /usr/hdp/current/hive-client/lib/mysql-connector-java.jar'}
2015-05-04 10:47:31,282 - Skipping Execute['hive mkdir -p
/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; rm -f
/usr/hdp/current/hive-client/lib/mysql-connector-java.jar ; cp
/usr/share/java/mysql-connector-java.jar
/usr/hdp/current/hive-client/lib/mysql-connector-java.jar'] due to
not_if
2015-05-04 10:47:31,285 - Execute['/bin/sh -c 'cd
/usr/lib/ambari-agent/ && curl -kf -x "" --retry 5
http://sv2lxbdp2mst05.corp.equinix.com:8080/resources/DBConnectionVerification.jar
-o DBConnectionVerification.jar''] {'environment': ..., 'not_if': '[
-f DBConnectionVerification.jar]'}
2015-05-04 10:47:31,370 -
File['/var/lib/ambari-agent/data/tmp/start_hiveserver2_script']
{'content': Template('startHiveserver2.sh.j2'), 'mode': 0755}
2015-05-04 10:47:31,372 - Directory['/var/run/hive'] {'owner': 'hive',
'group': 'hadoop', 'mode': 0755, 'recursive': True}
2015-05-04 10:47:31,372 - Directory['/var/log/hive'] {'owner': 'hive',
'group': 'hadoop', 'mode': 0755, 'recursive': True}
2015-05-04 10:47:31,373 - Directory['/var/lib/hive'] {'owner': 'hive',
'group': 'hadoop', 'mode': 0755, 'recursive': True}
2015-05-04 10:47:31,453 - Could not verify HDP version by calling
'/usr/bin/hdp-select versions > /tmp/tmpTrvIN3'. Return Code: 0,
Output: 2.2.4.2-2
.
2015-05-04 10:47:31,531 - Could not verify HDP version by calling
'/usr/bin/hdp-select versions > /tmp/tmpHFa97f'. Return Code: 0,
Output: 2.2.4.2-2
.
2015-05-04 10:47:31,600 - Execute['env
JAVA_HOME=/usr/jdk64/jdk1.7.0_67
/var/lib/ambari-agent/data/tmp/start_hiveserver2_script
/var/log/hive/hive-server2.out /var/log/hive/hive-server2.log
/var/run/hive/hive-server.pid /etc/hive/conf.server /var/log/hive']
{'environment': ..., 'not_if': 'ls /var/run/hive/hive-server.pid
>/dev/null 2>&1 && ps `cat /var/run/hive/hive-server.pid` >/dev/null
2>&1', 'user': 'hive', 'path':
['/usr/lib/ambari-server/*:/sbin:/usr/sbin:/bin:/usr/bin:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin']}
2015-05-04 10:47:31,693 - Execute['/usr/jdk64/jdk1.7.0_67/bin/java -cp
/usr/lib/ambari-agent/DBConnectionVerification.jar:/usr/share/java/mysql-connector-java.jar
org.apache.ambari.server.DBConnectionVerification
'jdbc:mysql://sv2lxbdp2mst04.corp.equinix.com/hive?createDatabaseIfNotExist=true'
hive [PROTECTED] com.mysql.jdbc.Driver'] {'path':
['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 5,
'try_sleep': 10}
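As a quick way to check the claim above that the keytabs are working, one
could run something like the following on the HiveServer2 host. This is a
sketch, not part of the original report: the keytab path is the one that
appears in the log above, and using the bare `hdfs` principal name with it
is an assumption.

```shell
# Sketch: verify a keytab by listing its principals and attempting a
# non-interactive kinit with it (assumed procedure, not from the thread).
KEYTAB=/etc/security/keytabs/hdfs.headless.keytab
if command -v klist >/dev/null 2>&1 && [ -r "$KEYTAB" ]; then
    klist -kt "$KEYTAB"            # show principals and key timestamps
    kinit -kt "$KEYTAB" hdfs       # obtain a TGT without a password
    klist                          # confirm the credential cache holds a TGT
else
    echo "Kerberos tools or keytab not available on this host"
fi
```

If `kinit` succeeds and `klist` shows a `krbtgt/...` entry, the keytab and
KDC connectivity are fine and the failure lies elsewhere.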

Re: Ambari Unable to start Hive server 2 after enabling security

Posted by Shaik M <mu...@gmail.com>.
Hi Alex/Rob,

Thank you for providing the patch for this issue.

After applying the patch, Ambari is able to start HiveServer2 without any
manual kinit.

Regards,
Shaik
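For reference, the manual workaround quoted further down in this thread
(running kinit as the hive user before starting the service) amounts to
something like the sketch below. The keytab path and principal come from
the thread itself; the guard checks are an assumption about how one would
run this safely on an arbitrary host.

```shell
# Sketch of the manual workaround: put a TGT in the hive user's credential
# cache, then start HiveServer2 from Ambari as usual.
KEYTAB=/etc/security/keytabs/hdfs.headless.keytab
if id hive >/dev/null 2>&1 && [ -r "$KEYTAB" ]; then
    # kinit runs in the hive user's session so HiveServer2 inherits the
    # ticket cache; klist confirms the TGT was obtained.
    su - hive -c "/usr/bin/kinit -kt $KEYTAB hdfs && klist"
else
    echo "hive user or keytab not present; run this on the HiveServer2 host"
fi
```

Note that the ticket expires, which matches the observation below that the
service goes down once the ticket is destroyed; the patch removes the need
for this step entirely.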

On 6 May 2015 at 07:22, Alexander Denissov <ad...@pivotal.io> wrote:

> To have Ambari 1.7.0 work with HDP 2.2.4 and secure Hive you will most
> likely need to apply a patch (from Ambari 2.0):
> https://issues.apache.org/jira/browse/AMBARI-9535
>
> --
> Thanks,
> Alex.
>
> On Tue, May 5, 2015 at 5:24 AM, Robert Levas <rl...@hortonworks.com>
> wrote:
>
>>  Hi Shaik…
>>
>>  That is a good question. According to
>> https://cwiki.apache.org/confluence/display/Hive/Setting+Up+HiveServer2,
>> it doesn’t appear that a kinit is needed before starting up the server.
>>
>>  Rob
>>
>>
>>   From: Shaik M <mu...@gmail.com>
>> Reply-To: "user@ambari.apache.org" <us...@ambari.apache.org>
>> Date: Tuesday, May 5, 2015 at 5:15 AM
>> To: "user@ambari.apache.org" <us...@ambari.apache.org>
>> Subject: Re: Ambari Unable to start Hive server 2 after enabling security
>>
>>   Ambari Team,
>>
>>  can you please check, why it is required to do manual "kinit" for
>> HiveServer2 startup?
>>
>>
>>
>> On 4 May 2015 at 22:24, Shaik M <mu...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>>  After verifying the Amabri agent log, I logged in as "hive"  user and
>>> I ran below command.
>>>
>>>  /usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs
>>>
>>>  after that I have started "hiveserver2" service from Amabri and it
>>> started the service. Now, I can able to connect to beeline client.
>>>
>>>  I have re-verified, with destroying the key and the hiveserver2
>>> service went down after destroying the key.
>>>
>>>  Look like Ambari having some issue to initiate the HiveServer2 keytabs.
>>>
>>>  -Shaik
>>>
>>> On 4 May 2015 at 21:54, Shaik M <mu...@gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>>
>>>>  *I am getting the following exception in the HiveServer2 log file:*
>>>>
>>>>  2015-05-04 13:45:23,542 WARN  [main]: ipc.Client
>>>> (Client.java:run(676)) - Exception encountered while connecting to the
>>>> server : javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>> find any Kerberos tgt)]
>>>> 2015-05-04 13:45:23,543 INFO  [main]: retry.RetryInvocationHandler
>>>> (RetryInvocationHandler.java:invoke(140)) - Exception while invoking
>>>> getFileInfo of class ClientNamenodeProtocolTranslatorPB over
>>>> sv2lxbdp2mst05.corp.host.com/10.192.149.187:8020 after 8 fail over
>>>> attempts. Trying to fail over immediately.
>>>> java.io.IOException: Failed on local exception: java.io.IOException:
>>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>>> find any Kerberos tgt)];
>>>>
>>>>  Thanks,
>>>> Shaik
>>>>
>>>>  (If this is not the right place to post this query, please point me
>>>> to the correct group)
>>>>
>>>> On 4 May 2015 at 18:55, Shaik M <mu...@gmail.com> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>>  I am using Ambari 1.7 and HDP 2.2.4. I have enabled security in this
>>>>> cluster.
>>>>>
>>>>>  After enabling the security Ambari unable start the Hive server 2.
>>>>>
>>>>>  I have verified keytabs and it's working fine. Please find the below
>>>>> ambari string process log below. not shown any output in standerr window.
>>>>>
>>>>>  please let me know the how to resolve this issue.
>>>>>
>>>>>  Thanks,
>>>>> Shaik M
>>>>>
>>>>>   stdout:   /var/lib/ambari-agent/data/output-1832.txt
>>>>>
>>>>> 2015-05-04 10:47:25,616 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 10     http://sv2lxbdp2mst05.corp.equinix.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
>>>>> 2015-05-04 10:47:25,646 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 10     http://sv2lxbdp2mst05.corp.equinix.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
>>>>> 2015-05-04 10:47:25,648 - Execute['rm -f local_policy.jar; rm -f US_export_policy.jar; unzip -o -j -q /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'path': ['/bin/', '/usr/bin'], 'only_if': 'test -e /usr/jdk64/jdk1.7.0_67/jre/lib/security && test -f /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'cwd': '/usr/jdk64/jdk1.7.0_67/jre/lib/security'}
>>>>> 2015-05-04 10:47:25,718 - Group['hadoop'] {'ignore_failures': False}
>>>>> 2015-05-04 10:47:25,720 - Modifying group hadoop
>>>>> 2015-05-04 10:47:25,784 - Group['nobody'] {'ignore_failures': False}
>>>>> 2015-05-04 10:47:25,785 - Modifying group nobody
>>>>> 2015-05-04 10:47:25,839 - Group['users'] {'ignore_failures': False}
>>>>> 2015-05-04 10:47:25,840 - Modifying group users
>>>>> 2015-05-04 10:47:25,886 - Group['nagios'] {'ignore_failures': False}
>>>>> 2015-05-04 10:47:25,887 - Modifying group nagios
>>>>> 2015-05-04 10:47:25,936 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
>>>>> 2015-05-04 10:47:25,937 - Modifying user nobody
>>>>> 2015-05-04 10:47:26,012 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
>>>>> 2015-05-04 10:47:26,013 - Modifying user oozie
>>>>> 2015-05-04 10:47:26,063 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>>>>> 2015-05-04 10:47:26,064 - Modifying user hive
>>>>> 2015-05-04 10:47:26,098 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>>>>> 2015-05-04 10:47:26,099 - Modifying user mapred
>>>>> 2015-05-04 10:47:26,131 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False, 'groups': [u'hadoop']}
>>>>> 2015-05-04 10:47:26,132 - Modifying user nagios
>>>>> 2015-05-04 10:47:26,165 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
>>>>> 2015-05-04 10:47:26,166 - Modifying user ambari-qa
>>>>> 2015-05-04 10:47:26,199 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>>>>> 2015-05-04 10:47:26,199 - Modifying user zookeeper
>>>>> 2015-05-04 10:47:26,232 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
>>>>> 2015-05-04 10:47:26,233 - Modifying user tez
>>>>> 2015-05-04 10:47:26,265 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>>>>> 2015-05-04 10:47:26,266 - Modifying user hdfs
>>>>> 2015-05-04 10:47:26,298 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>>>>> 2015-05-04 10:47:26,299 - Modifying user sqoop
>>>>> 2015-05-04 10:47:26,332 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>>>>> 2015-05-04 10:47:26,332 - Modifying user hcat
>>>>> 2015-05-04 10:47:26,365 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>>>>> 2015-05-04 10:47:26,366 - Modifying user yarn
>>>>> 2015-05-04 10:47:26,399 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>>>>> 2015-05-04 10:47:26,401 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
>>>>> 2015-05-04 10:47:26,431 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if
>>>>> 2015-05-04 10:47:26,432 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
>>>>> 2015-05-04 10:47:26,433 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
>>>>> 2015-05-04 10:47:26,462 - Skipping Link['/etc/hadoop/conf'] due to not_if
>>>>> 2015-05-04 10:47:26,491 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root'}
>>>>> 2015-05-04 10:47:26,516 - Execute['/bin/echo 0 > /selinux/enforce'] {'only_if': 'test -f /selinux/enforce'}
>>>>> 2015-05-04 10:47:26,577 - Directory['/var/log/hadoop'] {'owner': 'root', 'group': 'hadoop', 'mode': 0775, 'recursive': True}
>>>>> 2015-05-04 10:47:26,578 - Directory['/var/run/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True}
>>>>> 2015-05-04 10:47:26,579 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'recursive': True}
>>>>> 2015-05-04 10:47:26,590 - File['/etc/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'root'}
>>>>> 2015-05-04 10:47:26,595 - File['/etc/hadoop/conf/health_check'] {'content': Template('health_check-v2.j2'), 'owner': 'root'}
>>>>> 2015-05-04 10:47:26,596 - File['/etc/hadoop/conf/log4j.properties'] {'content': '...', 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
>>>>> 2015-05-04 10:47:26,608 - File['/etc/hadoop/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
>>>>> 2015-05-04 10:47:26,609 - File['/etc/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
>>>>> 2015-05-04 10:47:26,956 - Execute['kill `cat /var/run/hive/hive-server.pid` >/dev/null 2>&1 && rm -f /var/run/hive/hive-server.pid'] {'not_if': '! (ls /var/run/hive/hive-server.pid >/dev/null 2>&1 && ps `cat /var/run/hive/hive-server.pid` >/dev/null 2>&1)'}
>>>>> 2015-05-04 10:47:27,068 - HdfsDirectory['/apps/hive/warehouse'] {'security_enabled': True, 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '/usr/bin/kinit', 'mode': 0777, 'owner': 'hive', 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'action': ['create_delayed']}
>>>>> 2015-05-04 10:47:27,069 - HdfsDirectory['/user/hive'] {'security_enabled': True, 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '/usr/bin/kinit', 'mode': 0700, 'owner': 'hive', 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'action': ['create_delayed']}
>>>>> 2015-05-04 10:47:27,070 - HdfsDirectory['None'] {'security_enabled': True, 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '/usr/bin/kinit', 'action': ['create'], 'bin_dir': '/usr/hdp/current/hadoop-client/bin'}
>>>>> 2015-05-04 10:47:27,073 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs'] {'user': 'hdfs'}
>>>>> 2015-05-04 10:47:27,850 - Execute['hadoop --config /etc/hadoop/conf fs -mkdir `rpm -q hadoop | grep -q "hadoop-1" || echo "-p"` /apps/hive/warehouse /user/hive && hadoop --config /etc/hadoop/conf fs -chmod  777 /apps/hive/warehouse && hadoop --config /etc/hadoop/conf fs -chmod  700 /user/hive && hadoop --config /etc/hadoop/conf fs -chown  hive /apps/hive/warehouse /user/hive'] {'not_if': "su - hdfs -c 'export PATH=$PATH:/usr/hdp/current/hadoop-client/bin ; hadoop --config /etc/hadoop/conf fs -ls /apps/hive/warehouse /user/hive'", 'user': 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
>>>>> 2015-05-04 10:47:31,133 - Skipping Execute['hadoop --config /etc/hadoop/conf fs -mkdir `rpm -q hadoop | grep -q "hadoop-1" || echo "-p"` /apps/hive/warehouse /user/hive && hadoop --config /etc/hadoop/conf fs -chmod  777 /apps/hive/warehouse && hadoop --config /etc/hadoop/conf fs -chmod  700 /user/hive && hadoop --config /etc/hadoop/conf fs -chown  hive /apps/hive/warehouse /user/hive'] due to not_if
>>>>> 2015-05-04 10:47:31,134 - Directory['/etc/hive/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'recursive': True}
>>>>> 2015-05-04 10:47:31,135 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/conf.server', 'mode': 0644, 'configuration_attributes': ..., 'owner': 'hive', 'configurations': ...}
>>>>> 2015-05-04 10:47:31,168 - Generating config: /etc/hive/conf.server/mapred-site.xml
>>>>> 2015-05-04 10:47:31,169 - File['/etc/hive/conf.server/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
>>>>> 2015-05-04 10:47:31,173 - Writing File['/etc/hive/conf.server/mapred-site.xml'] because contents don't match
>>>>> 2015-05-04 10:47:31,174 - File['/etc/hive/conf.server/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop'}
>>>>> 2015-05-04 10:47:31,175 - File['/etc/hive/conf.server/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop'}
>>>>> 2015-05-04 10:47:31,177 - File['/etc/hive/conf.server/hive-exec-log4j.properties'] {'content': '...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
>>>>> 2015-05-04 10:47:31,178 - File['/etc/hive/conf.server/hive-log4j.properties'] {'content': '...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
>>>>> 2015-05-04 10:47:31,179 - Directory['/etc/hive/conf'] {'owner': 'hive', 'group': 'hadoop', 'recursive': True}
>>>>> 2015-05-04 10:47:31,180 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/conf', 'mode': 0644, 'configuration_attributes': ..., 'owner': 'hive', 'configurations': ...}
>>>>> 2015-05-04 10:47:31,202 - Generating config: /etc/hive/conf/mapred-site.xml
>>>>> 2015-05-04 10:47:31,203 - File['/etc/hive/conf/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
>>>>> 2015-05-04 10:47:31,207 - Writing File['/etc/hive/conf/mapred-site.xml'] because contents don't match
>>>>> 2015-05-04 10:47:31,208 - File['/etc/hive/conf/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop'}
>>>>> 2015-05-04 10:47:31,209 - File['/etc/hive/conf/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop'}
>>>>> 2015-05-04 10:47:31,211 - File['/etc/hive/conf/hive-exec-log4j.properties'] {'content': '...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
>>>>> 2015-05-04 10:47:31,212 - File['/etc/hive/conf/hive-log4j.properties'] {'content': '...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
>>>>> 2015-05-04 10:47:31,213 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/conf.server', 'mode': 0644, 'configuration_attributes': ..., 'owner': 'hive', 'configurations': ...}
>>>>> 2015-05-04 10:47:31,236 - Generating config: /etc/hive/conf.server/hive-site.xml
>>>>> 2015-05-04 10:47:31,237 - File['/etc/hive/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
>>>>> 2015-05-04 10:47:31,245 - Writing File['/etc/hive/conf.server/hive-site.xml'] because contents don't match
>>>>> 2015-05-04 10:47:31,251 - File['/etc/hive/conf.server/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop'}
>>>>> 2015-05-04 10:47:31,253 - Execute['hive mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; rm -f /usr/hdp/current/hive-client/lib/mysql-connector-java.jar ; cp /usr/share/java/mysql-connector-java.jar /usr/hdp/current/hive-client/lib/mysql-connector-java.jar'] {'environment': ..., 'path': ['/bin', '/usr/bin/'], 'creates': '/usr/hdp/current/hive-client/lib/mysql-connector-java.jar', 'not_if': 'test -f /usr/hdp/current/hive-client/lib/mysql-connector-java.jar'}
>>>>> 2015-05-04 10:47:31,282 - Skipping Execute['hive mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; rm -f /usr/hdp/current/hive-client/lib/mysql-connector-java.jar ; cp /usr/share/java/mysql-connector-java.jar /usr/hdp/current/hive-client/lib/mysql-connector-java.jar'] due to not_if
>>>>> 2015-05-04 10:47:31,285 - Execute['/bin/sh -c 'cd /usr/lib/ambari-agent/ && curl -kf -x "" --retry 5 http://sv2lxbdp2mst05.corp.equinix.com:8080/resources/DBConnectionVerification.jar -o DBConnectionVerification.jar''] {'environment': ..., 'not_if': '[ -f DBConnectionVerification.jar]'}
>>>>> 2015-05-04 10:47:31,370 - File['/var/lib/ambari-agent/data/tmp/start_hiveserver2_script'] {'content': Template('startHiveserver2.sh.j2'), 'mode': 0755}
>>>>> 2015-05-04 10:47:31,372 - Directory['/var/run/hive'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755, 'recursive': True}
>>>>> 2015-05-04 10:47:31,372 - Directory['/var/log/hive'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755, 'recursive': True}
>>>>> 2015-05-04 10:47:31,373 - Directory['/var/lib/hive'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755, 'recursive': True}
>>>>> 2015-05-04 10:47:31,453 - Could not verify HDP version by calling '/usr/bin/hdp-select versions > /tmp/tmpTrvIN3'. Return Code: 0, Output: 2.2.4.2-2
>>>>> .
>>>>> 2015-05-04 10:47:31,531 - Could not verify HDP version by calling '/usr/bin/hdp-select versions > /tmp/tmpHFa97f'. Return Code: 0, Output: 2.2.4.2-2
>>>>> .
>>>>> 2015-05-04 10:47:31,600 - Execute['env JAVA_HOME=/usr/jdk64/jdk1.7.0_67 /var/lib/ambari-agent/data/tmp/start_hiveserver2_script /var/log/hive/hive-server2.out /var/log/hive/hive-server2.log /var/run/hive/hive-server.pid /etc/hive/conf.server /var/log/hive'] {'environment': ..., 'not_if': 'ls /var/run/hive/hive-server.pid >/dev/null 2>&1 && ps `cat /var/run/hive/hive-server.pid` >/dev/null 2>&1', 'user': 'hive', 'path': ['/usr/lib/ambari-server/*:/sbin:/usr/sbin:/bin:/usr/bin:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin']}
>>>>> 2015-05-04 10:47:31,693 - Execute['/usr/jdk64/jdk1.7.0_67/bin/java -cp /usr/lib/ambari-agent/DBConnectionVerification.jar:/usr/share/java/mysql-connector-java.jar org.apache.ambari.server.DBConnectionVerification 'jdbc:mysql://sv2lxbdp2mst04.corp.equinix.com/hive?createDatabaseIfNotExist=true' hive [PROTECTED] com.mysql.jdbc.Driver'] {'path': ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 5, 'try_sleep': 10}
>>>>>
>>>>>
>>>>
>>>
>>
>
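A note on the start command in the quoted log above: Ambari only runs the
start script when the 'not_if' guard fails, i.e. when no live HiveServer2
pid is recorded. That guard can be sketched as a small shell check (the
pid-file path below is a temporary stand-in, not the real /var/run/hive
path, and `ps -p` is used in place of the bare `ps` in the log):

```shell
# Sketch of the liveness guard Ambari uses as the start command's 'not_if':
# the start script is skipped only when the pid file exists AND the
# recorded process is still alive.
PIDFILE="$(mktemp)"                  # stand-in for /var/run/hive/hive-server.pid
echo $$ > "$PIDFILE"                 # record a pid known to be alive (this shell)
if [ -f "$PIDFILE" ] && ps -p "$(cat "$PIDFILE")" >/dev/null 2>&1; then
  STATUS="running"                   # Ambari would skip start_hiveserver2_script
else
  STATUS="stopped"                   # Ambari would invoke start_hiveserver2_script
fi
echo "$STATUS"
rm -f "$PIDFILE"
```

This explains why a stale pid file can make Ambari think HiveServer2 is
already up: the check never consults Kerberos state at all.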

Re: Ambari Unable to start Hive server 2 after enabling security

Posted by Alexander Denissov <ad...@pivotal.io>.
To have Ambari 1.7.0 work with HDP 2.2.4 and secure Hive, you will most
likely need to apply a patch (from Ambari 2.0):
https://issues.apache.org/jira/browse/AMBARI-9535

--
Thanks,
Alex.

On Tue, May 5, 2015 at 5:24 AM, Robert Levas <rl...@hortonworks.com> wrote:

>  Hi Shaik…
>
>  That is a good question. According to
> https://cwiki.apache.org/confluence/display/Hive/Setting+Up+HiveServer2,
> it doesn’t appear that a kinit is needed before starting up the server.
>
>  Rob
>
>
>   From: Shaik M <mu...@gmail.com>
> Reply-To: "user@ambari.apache.org" <us...@ambari.apache.org>
> Date: Tuesday, May 5, 2015 at 5:15 AM
> To: "user@ambari.apache.org" <us...@ambari.apache.org>
> Subject: Re: Ambari Unable to start Hive server 2 after enabling security
>
>   Ambari Team,
>
> Can you please check why a manual "kinit" is required for HiveServer2
> startup?
>
>
>
> On 4 May 2015 at 22:24, Shaik M <mu...@gmail.com> wrote:
>
>> Hi,
>>
>>  After verifying the Ambari agent log, I logged in as the "hive" user and
>> ran the command below.
>>
>>  /usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs
>>
>>  After that, I started the "hiveserver2" service from Ambari and it
>> started successfully. Now I am able to connect with the beeline client.
>>
>>  I re-verified by destroying the ticket; the hiveserver2 service went down
>> after the ticket was destroyed.
>>
>>  It looks like Ambari has an issue initializing the Kerberos credentials
>> for HiveServer2.
>>
>>  -Shaik
>>
>>
>> On 4 May 2015 at 21:54, Shaik M <mu...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>>
>>>  *I am getting the following exception in the HiveServer2 log file:*
>>>
>>>  2015-05-04 13:45:23,542 WARN  [main]: ipc.Client
>>> (Client.java:run(676)) - Exception encountered while connecting to the
>>> server : javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>> find any Kerberos tgt)]
>>> 2015-05-04 13:45:23,543 INFO  [main]: retry.RetryInvocationHandler
>>> (RetryInvocationHandler.java:invoke(140)) - Exception while invoking
>>> getFileInfo of class ClientNamenodeProtocolTranslatorPB over
>>> sv2lxbdp2mst05.corp.host.com/10.192.149.187:8020 after 8 fail over
>>> attempts. Trying to fail over immediately.
>>> java.io.IOException: Failed on local exception: java.io.IOException:
>>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>>> GSSException: No valid credentials provided (Mechanism level: Failed to
>>> find any Kerberos tgt)];
>>>
>>>  Thanks,
>>> Shaik
>>>
>>>  (If this is not the right place to post this query, please point me to
>>> the correct group.)
>>>
>>> On 4 May 2015 at 18:55, Shaik M <mu...@gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>>  I am using Ambari 1.7 and HDP 2.2.4. I have enabled security in this
>>>> cluster.
>>>>
>>>> After enabling security, Ambari is unable to start Hive Server 2.
>>>>
>>>>  I have verified the keytabs and they work fine. Please find the Ambari
>>>> process log below; no output is shown in the stderr window.
>>>>
>>>>  Please let me know how to resolve this issue.
>>>>
>>>>  Thanks,
>>>> Shaik M
>>>>
>>>>   stdout:   /var/lib/ambari-agent/data/output-1832.txt
>>>>
>

Re: Ambari Unable to start Hive server 2 after enabling security

Posted by Robert Levas <rl...@hortonworks.com>.
Hi Shaik…

That is a good question. According to https://cwiki.apache.org/confluence/display/Hive/Setting+Up+HiveServer2, it doesn’t appear that a kinit is needed before starting up the server.
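For reference, HiveServer2 is supposed to perform its own Kerberos login
using the properties in hive-site.xml, which is why no manual kinit should
be needed. A typical secure configuration looks like the fragment below
(the realm and keytab path are illustrative examples, not values taken
from Shaik's cluster):

```xml
<!-- hive-site.xml: HiveServer2 logs in from its own keytab via these
     properties; the realm and path below are illustrative only. -->
<property>
  <name>hive.server2.authentication</name>
  <value>KERBEROS</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.principal</name>
  <value>hive/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>hive.server2.authentication.kerberos.keytab</name>
  <value>/etc/security/keytabs/hive.service.keytab</value>
</property>
```

If these are present and the keytab is readable by the hive user, startup
should not depend on a pre-existing ticket cache.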

Rob


From: Shaik M <mu...@gmail.com>
Reply-To: "user@ambari.apache.org" <us...@ambari.apache.org>
Date: Tuesday, May 5, 2015 at 5:15 AM
To: "user@ambari.apache.org" <us...@ambari.apache.org>
Subject: Re: Ambari Unable to start Hive server 2 after enabling security

Ambari Team,

Can you please check why a manual "kinit" is required for HiveServer2 startup?



On 4 May 2015 at 22:24, Shaik M <mu...@gmail.com> wrote:
Hi,

After verifying the Ambari agent log, I logged in as the "hive" user and ran the command below.

/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs

After that, I started the "hiveserver2" service from Ambari and it started successfully. Now I am able to connect with the beeline client.

I re-verified by destroying the ticket; the hiveserver2 service went down after the ticket was destroyed.

It looks like Ambari has an issue initializing the Kerberos credentials for HiveServer2.
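Shaik's manual workaround can be reproduced as a short, hedged shell check:
confirm the keytab is readable, obtain a ticket from it, and list the
resulting cache. The keytab path and principal below are the ones from his
mail; substitute your own cluster's values, and note the sketch skips
itself when the Kerberos tools or the keytab are absent:

```shell
# Hedged sketch of the manual kinit workaround described above.
KEYTAB=/etc/security/keytabs/hdfs.headless.keytab   # path from Shaik's mail
PRINCIPAL=hdfs                                      # principal from Shaik's mail
if command -v kinit >/dev/null 2>&1 && [ -r "$KEYTAB" ]; then
  klist -kt "$KEYTAB"                 # list the principals stored in the keytab
  kinit -kt "$KEYTAB" "$PRINCIPAL"    # obtain a TGT non-interactively
  klist                               # confirm the cache now holds a TGT
  RESULT=verified
else
  RESULT=skipped                      # no Kerberos tools/keytab on this host
fi
echo "$RESULT"
```

If the `kinit` step fails here, the keytab itself is the problem rather
than Ambari's startup logic.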

-Shaik

On 4 May 2015 at 21:54, Shaik M <mu...@gmail.com> wrote:
Hi,


I am getting the following exception in the HiveServer2 log file:

2015-05-04 13:45:23,542 WARN  [main]: ipc.Client (Client.java:run(676)) - Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2015-05-04 13:45:23,543 INFO  [main]: retry.RetryInvocationHandler (RetryInvocationHandler.java:invoke(140)) - Exception while invoking getFileInfo of class ClientNamenodeProtocolTranslatorPB over sv2lxbdp2mst05.corp.host.com/10.192.149.187:8020 after 8 fail over attempts. Trying to fail over immediately.
java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)];
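The "Failed to find any Kerberos tgt" message above means the HiveServer2
process found no TGT in its credential cache. A quick, hedged way to
confirm this is to list the cache as the service user (the user name
`hive` is an assumption, and the sketch skips itself when `klist` or that
user is not present):

```shell
# Hedged sketch: inspect the hive user's Kerberos credential cache to see
# whether a TGT exists (absence matches the GSS error above).
if command -v klist >/dev/null 2>&1 && id hive >/dev/null 2>&1; then
  # klist exits non-zero when the cache is empty or missing
  su - hive -c 'klist' || echo "no credentials cache for hive"
  CACHE_CHECK=done
else
  CACHE_CHECK=skipped   # klist or the hive user not present on this host
fi
echo "$CACHE_CHECK"
```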

Thanks,
Shaik

(If this is not the right place to post this query, please point me to the correct group.)

On 4 May 2015 at 18:55, Shaik M <mu...@gmail.com> wrote:
Hi,

I am using Ambari 1.7 and HDP 2.2.4. I have enabled security in this cluster.

After enabling security, Ambari is unable to start Hive Server 2.

I have verified the keytabs and they work fine. Please find the Ambari process log below; no output is shown in the stderr window.

Please let me know how to resolve this issue.

Thanks,
Shaik M

stdout:   /var/lib/ambari-agent/data/output-1832.txt

2015-05-04 10:47:25,616 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 10     http://sv2lxbdp2mst05.corp.equinix.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
2015-05-04 10:47:25,646 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 10     http://sv2lxbdp2mst05.corp.equinix.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
2015-05-04 10:47:25,648 - Execute['rm -f local_policy.jar; rm -f US_export_policy.jar; unzip -o -j -q /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'path': ['/bin/', '/usr/bin'], 'only_if': 'test -e /usr/jdk64/jdk1.7.0_67/jre/lib/security && test -f /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'cwd': '/usr/jdk64/jdk1.7.0_67/jre/lib/security'}
2015-05-04 10:47:25,718 - Group['hadoop'] {'ignore_failures': False}
2015-05-04 10:47:25,720 - Modifying group hadoop
2015-05-04 10:47:25,784 - Group['nobody'] {'ignore_failures': False}
2015-05-04 10:47:25,785 - Modifying group nobody
2015-05-04 10:47:25,839 - Group['users'] {'ignore_failures': False}
2015-05-04 10:47:25,840 - Modifying group users
2015-05-04 10:47:25,886 - Group['nagios'] {'ignore_failures': False}
2015-05-04 10:47:25,887 - Modifying group nagios
2015-05-04 10:47:25,936 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
2015-05-04 10:47:25,937 - Modifying user nobody
2015-05-04 10:47:26,012 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-05-04 10:47:26,013 - Modifying user oozie
2015-05-04 10:47:26,063 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,064 - Modifying user hive
2015-05-04 10:47:26,098 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,099 - Modifying user mapred
2015-05-04 10:47:26,131 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,132 - Modifying user nagios
2015-05-04 10:47:26,165 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-05-04 10:47:26,166 - Modifying user ambari-qa
2015-05-04 10:47:26,199 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,199 - Modifying user zookeeper
2015-05-04 10:47:26,232 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-05-04 10:47:26,233 - Modifying user tez
2015-05-04 10:47:26,265 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,266 - Modifying user hdfs
2015-05-04 10:47:26,298 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,299 - Modifying user sqoop
2015-05-04 10:47:26,332 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,332 - Modifying user hcat
2015-05-04 10:47:26,365 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-05-04 10:47:26,366 - Modifying user yarn
2015-05-04 10:47:26,399 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-05-04 10:47:26,401 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
2015-05-04 10:47:26,431 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if
2015-05-04 10:47:26,432 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
2015-05-04 10:47:26,433 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2015-05-04 10:47:26,462 - Skipping Link['/etc/hadoop/conf'] due to not_if
2015-05-04 10:47:26,491 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root'}
2015-05-04 10:47:26,516 - Execute['/bin/echo 0 > /selinux/enforce'] {'only_if': 'test -f /selinux/enforce'}
2015-05-04 10:47:26,577 - Directory['/var/log/hadoop'] {'owner': 'root', 'group': 'hadoop', 'mode': 0775, 'recursive': True}
2015-05-04 10:47:26,578 - Directory['/var/run/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True}
2015-05-04 10:47:26,579 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'recursive': True}
2015-05-04 10:47:26,590 - File['/etc/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'root'}
2015-05-04 10:47:26,595 - File['/etc/hadoop/conf/health_check'] {'content': Template('health_check-v2.j2'), 'owner': 'root'}
2015-05-04 10:47:26,596 - File['/etc/hadoop/conf/log4j.properties'] {'content': '...', 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2015-05-04 10:47:26,608 - File['/etc/hadoop/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
2015-05-04 10:47:26,609 - File['/etc/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2015-05-04 10:47:26,956 - Execute['kill `cat /var/run/hive/hive-server.pid` >/dev/null 2>&1 && rm -f /var/run/hive/hive-server.pid'] {'not_if': '! (ls /var/run/hive/hive-server.pid >/dev/null 2>&1 && ps `cat /var/run/hive/hive-server.pid` >/dev/null 2>&1)'}
2015-05-04 10:47:27,068 - HdfsDirectory['/apps/hive/warehouse'] {'security_enabled': True, 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '/usr/bin/kinit', 'mode': 0777, 'owner': 'hive', 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'action': ['create_delayed']}
2015-05-04 10:47:27,069 - HdfsDirectory['/user/hive'] {'security_enabled': True, 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '/usr/bin/kinit', 'mode': 0700, 'owner': 'hive', 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'action': ['create_delayed']}
2015-05-04 10:47:27,070 - HdfsDirectory['None'] {'security_enabled': True, 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '/usr/bin/kinit', 'action': ['create'], 'bin_dir': '/usr/hdp/current/hadoop-client/bin'}
2015-05-04 10:47:27,073 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs'] {'user': 'hdfs'}
2015-05-04 10:47:27,850 - Execute['hadoop --config /etc/hadoop/conf fs -mkdir `rpm -q hadoop | grep -q "hadoop-1" || echo "-p"` /apps/hive/warehouse /user/hive && hadoop --config /etc/hadoop/conf fs -chmod  777 /apps/hive/warehouse && hadoop --config /etc/hadoop/conf fs -chmod  700 /user/hive && hadoop --config /etc/hadoop/conf fs -chown  hive /apps/hive/warehouse /user/hive'] {'not_if': "su - hdfs -c 'export PATH=$PATH:/usr/hdp/current/hadoop-client/bin ; hadoop --config /etc/hadoop/conf fs -ls /apps/hive/warehouse /user/hive'", 'user': 'hdfs', 'path': ['/usr/hdp/current/hadoop-client/bin']}
2015-05-04 10:47:31,133 - Skipping Execute['hadoop --config /etc/hadoop/conf fs -mkdir `rpm -q hadoop | grep -q "hadoop-1" || echo "-p"` /apps/hive/warehouse /user/hive && hadoop --config /etc/hadoop/conf fs -chmod  777 /apps/hive/warehouse && hadoop --config /etc/hadoop/conf fs -chmod  700 /user/hive && hadoop --config /etc/hadoop/conf fs -chown  hive /apps/hive/warehouse /user/hive'] due to not_if
2015-05-04 10:47:31,134 - Directory['/etc/hive/conf.server'] {'owner': 'hive', 'group': 'hadoop', 'recursive': True}
2015-05-04 10:47:31,135 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/conf.server', 'mode': 0644, 'configuration_attributes': ..., 'owner': 'hive', 'configurations': ...}
2015-05-04 10:47:31,168 - Generating config: /etc/hive/conf.server/mapred-site.xml
2015-05-04 10:47:31,169 - File['/etc/hive/conf.server/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2015-05-04 10:47:31,173 - Writing File['/etc/hive/conf.server/mapred-site.xml'] because contents don't match
2015-05-04 10:47:31,174 - File['/etc/hive/conf.server/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop'}
2015-05-04 10:47:31,175 - File['/etc/hive/conf.server/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop'}
2015-05-04 10:47:31,177 - File['/etc/hive/conf.server/hive-exec-log4j.properties'] {'content': '...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2015-05-04 10:47:31,178 - File['/etc/hive/conf.server/hive-log4j.properties'] {'content': '...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2015-05-04 10:47:31,179 - Directory['/etc/hive/conf'] {'owner': 'hive', 'group': 'hadoop', 'recursive': True}
2015-05-04 10:47:31,180 - XmlConfig['mapred-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/conf', 'mode': 0644, 'configuration_attributes': ..., 'owner': 'hive', 'configurations': ...}
2015-05-04 10:47:31,202 - Generating config: /etc/hive/conf/mapred-site.xml
2015-05-04 10:47:31,203 - File['/etc/hive/conf/mapred-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2015-05-04 10:47:31,207 - Writing File['/etc/hive/conf/mapred-site.xml'] because contents don't match
2015-05-04 10:47:31,208 - File['/etc/hive/conf/hive-default.xml.template'] {'owner': 'hive', 'group': 'hadoop'}
2015-05-04 10:47:31,209 - File['/etc/hive/conf/hive-env.sh.template'] {'owner': 'hive', 'group': 'hadoop'}
2015-05-04 10:47:31,211 - File['/etc/hive/conf/hive-exec-log4j.properties'] {'content': '...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2015-05-04 10:47:31,212 - File['/etc/hive/conf/hive-log4j.properties'] {'content': '...', 'owner': 'hive', 'group': 'hadoop', 'mode': 0644}
2015-05-04 10:47:31,213 - XmlConfig['hive-site.xml'] {'group': 'hadoop', 'conf_dir': '/etc/hive/conf.server', 'mode': 0644, 'configuration_attributes': ..., 'owner': 'hive', 'configurations': ...}
2015-05-04 10:47:31,236 - Generating config: /etc/hive/conf.server/hive-site.xml
2015-05-04 10:47:31,237 - File['/etc/hive/conf.server/hive-site.xml'] {'owner': 'hive', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2015-05-04 10:47:31,245 - Writing File['/etc/hive/conf.server/hive-site.xml'] because contents don't match
2015-05-04 10:47:31,251 - File['/etc/hive/conf.server/hive-env.sh'] {'content': InlineTemplate(...), 'owner': 'hive', 'group': 'hadoop'}
2015-05-04 10:47:31,253 - Execute['hive mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; rm -f /usr/hdp/current/hive-client/lib/mysql-connector-java.jar ; cp /usr/share/java/mysql-connector-java.jar /usr/hdp/current/hive-client/lib/mysql-connector-java.jar'] {'environment': ..., 'path': ['/bin', '/usr/bin/'], 'creates': '/usr/hdp/current/hive-client/lib/mysql-connector-java.jar', 'not_if': 'test -f /usr/hdp/current/hive-client/lib/mysql-connector-java.jar'}
2015-05-04 10:47:31,282 - Skipping Execute['hive mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; rm -f /usr/hdp/current/hive-client/lib/mysql-connector-java.jar ; cp /usr/share/java/mysql-connector-java.jar /usr/hdp/current/hive-client/lib/mysql-connector-java.jar'] due to not_if
2015-05-04 10:47:31,285 - Execute['/bin/sh -c 'cd /usr/lib/ambari-agent/ && curl -kf -x "" --retry 5 http://sv2lxbdp2mst05.corp.equinix.com:8080/resources/DBConnectionVerification.jar -o DBConnectionVerification.jar''] {'environment': ..., 'not_if': '[ -f DBConnectionVerification.jar]'}
2015-05-04 10:47:31,370 - File['/var/lib/ambari-agent/data/tmp/start_hiveserver2_script'] {'content': Template('startHiveserver2.sh.j2'), 'mode': 0755}
2015-05-04 10:47:31,372 - Directory['/var/run/hive'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755, 'recursive': True}
2015-05-04 10:47:31,372 - Directory['/var/log/hive'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755, 'recursive': True}
2015-05-04 10:47:31,373 - Directory['/var/lib/hive'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755, 'recursive': True}
2015-05-04 10:47:31,453 - Could not verify HDP version by calling '/usr/bin/hdp-select versions > /tmp/tmpTrvIN3'. Return Code: 0, Output: 2.2.4.2-2
.
2015-05-04 10:47:31,531 - Could not verify HDP version by calling '/usr/bin/hdp-select versions > /tmp/tmpHFa97f'. Return Code: 0, Output: 2.2.4.2-2
.
2015-05-04 10:47:31,600 - Execute['env JAVA_HOME=/usr/jdk64/jdk1.7.0_67 /var/lib/ambari-agent/data/tmp/start_hiveserver2_script /var/log/hive/hive-server2.out /var/log/hive/hive-server2.log /var/run/hive/hive-server.pid /etc/hive/conf.server /var/log/hive'] {'environment': ..., 'not_if': 'ls /var/run/hive/hive-server.pid >/dev/null 2>&1 && ps `cat /var/run/hive/hive-server.pid` >/dev/null 2>&1', 'user': 'hive', 'path': ['/usr/lib/ambari-server/*:/sbin:/usr/sbin:/bin:/usr/bin:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin']}
2015-05-04 10:47:31,693 - Execute['/usr/jdk64/jdk1.7.0_67/bin/java -cp /usr/lib/ambari-agent/DBConnectionVerification.jar:/usr/share/java/mysql-connector-java.jar org.apache.ambari.server.DBConnectionVerification 'jdbc:mysql://sv2lxbdp2mst04.corp.equinix.com/hive?createDatabaseIfNotExist=true<http://sv2lxbdp2mst04.corp.equinix.com/hive?createDatabaseIfNotExist=true>' hive [PROTECTED] com.mysql.jdbc.Driver'] {'path': ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 5, 'try_sleep': 10}
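[Editorial aside: one detail worth noting in the log above is the not_if guard on the DBConnectionVerification.jar download, '[ -f DBConnectionVerification.jar]'. The missing space before the closing bracket makes the test command error out every time, so the guard never matches and the download step is never skipped. A minimal shell demonstration, using a scratch file rather than the real jar:

```shell
cd "$(mktemp -d)"          # work in a throwaway directory
touch DBConnectionVerification.jar

# Correctly spaced: '[' receives '-f <file> ]' and succeeds
if [ -f DBConnectionVerification.jar ]; then
    echo "guard matches"
fi

# Missing space: '[' never receives its closing ']' argument,
# so it errors out and the guard can never match
if [ -f DBConnectionVerification.jar] 2>/dev/null; then
    echo "never printed"
else
    echo "guard broken"
fi
```

Running this prints "guard matches" followed by "guard broken".]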




Re: Ambari Unable to start Hive server 2 after enabling security

Posted by Shaik M <mu...@gmail.com>.
Ambari Team,

Can you please check why a manual "kinit" is required for HiveServer2 to
start?
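For reference, the manual workaround described in the quoted message below
(running kinit as the service user before starting HiveServer2) can be
sketched as a small script. The keytab path and principal are taken from
this cluster's log; treat them as assumptions for any other environment:

```shell
#!/bin/sh
# Sketch: verify the headless keytab and obtain a TGT before starting
# HiveServer2 by hand. Path and principal come from the log in this thread.
KEYTAB=/etc/security/keytabs/hdfs.headless.keytab
PRINCIPAL=hdfs

if [ -f "$KEYTAB" ]; then
    klist -kt "$KEYTAB"                        # list principals stored in the keytab
    /usr/bin/kinit -kt "$KEYTAB" "$PRINCIPAL"  # obtain a ticket-granting ticket
    klist                                      # confirm the ticket cache now holds a TGT
else
    echo "keytab not found: $KEYTAB"
fi
```

On a host without the keytab the script only reports that the file is
missing, which makes it safe to run as a first diagnostic step.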



On 4 May 2015 at 22:24, Shaik M <mu...@gmail.com> wrote:

> Hi,
>
> After verifying the Ambari agent log, I logged in as the "hive" user and
> ran the command below.
>
> /usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs
>
> After that, I started the "hiveserver2" service from Ambari, and it came
> up successfully. Now I am able to connect with the Beeline client.
>
> I re-verified by destroying the ticket, and the hiveserver2 service went
> down shortly afterwards.
>
> It looks like Ambari has an issue initializing the Kerberos credentials
> for HiveServer2.
>
> -Shaik
>
> On 4 May 2015 at 21:54, Shaik M <mu...@gmail.com> wrote:
>
>> Hi,
>>
>>
>> *I am getting the following exception in the HiveServer2 log file:*
>>
>> 2015-05-04 13:45:23,542 WARN  [main]: ipc.Client (Client.java:run(676)) -
>> Exception encountered while connecting to the server :
>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>> GSSException: No valid credentials provided (Mechanism level: Failed to
>> find any Kerberos tgt)]
>> 2015-05-04 13:45:23,543 INFO  [main]: retry.RetryInvocationHandler
>> (RetryInvocationHandler.java:invoke(140)) - Exception while invoking
>> getFileInfo of class ClientNamenodeProtocolTranslatorPB over
>> sv2lxbdp2mst05.corp.host.com/10.192.149.187:8020 after 8 fail over
>> attempts. Trying to fail over immediately.
>> java.io.IOException: Failed on local exception: java.io.IOException:
>> javax.security.sasl.SaslException: GSS initiate failed [Caused by
>> GSSException: No valid credentials provided (Mechanism level: Failed to
>> find any Kerberos tgt)];
>>
>> Thanks,
>> Shaik
>>
>> (If this is not the right place to post this query, please point me to
>> the correct group.)

>> 2015-05-04 10:47:31,282 - Skipping Execute['hive mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ; rm -f /usr/hdp/current/hive-client/lib/mysql-connector-java.jar ; cp /usr/share/java/mysql-connector-java.jar /usr/hdp/current/hive-client/lib/mysql-connector-java.jar'] due to not_if
>> 2015-05-04 10:47:31,285 - Execute['/bin/sh -c 'cd /usr/lib/ambari-agent/ && curl -kf -x "" --retry 5 http://sv2lxbdp2mst05.corp.equinix.com:8080/resources/DBConnectionVerification.jar -o DBConnectionVerification.jar''] {'environment': ..., 'not_if': '[ -f DBConnectionVerification.jar]'}
>> 2015-05-04 10:47:31,370 - File['/var/lib/ambari-agent/data/tmp/start_hiveserver2_script'] {'content': Template('startHiveserver2.sh.j2'), 'mode': 0755}
>> 2015-05-04 10:47:31,372 - Directory['/var/run/hive'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755, 'recursive': True}
>> 2015-05-04 10:47:31,372 - Directory['/var/log/hive'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755, 'recursive': True}
>> 2015-05-04 10:47:31,373 - Directory['/var/lib/hive'] {'owner': 'hive', 'group': 'hadoop', 'mode': 0755, 'recursive': True}
>> 2015-05-04 10:47:31,453 - Could not verify HDP version by calling '/usr/bin/hdp-select versions > /tmp/tmpTrvIN3'. Return Code: 0, Output: 2.2.4.2-2
>> .
>> 2015-05-04 10:47:31,531 - Could not verify HDP version by calling '/usr/bin/hdp-select versions > /tmp/tmpHFa97f'. Return Code: 0, Output: 2.2.4.2-2
>> .
>> 2015-05-04 10:47:31,600 - Execute['env JAVA_HOME=/usr/jdk64/jdk1.7.0_67 /var/lib/ambari-agent/data/tmp/start_hiveserver2_script /var/log/hive/hive-server2.out /var/log/hive/hive-server2.log /var/run/hive/hive-server.pid /etc/hive/conf.server /var/log/hive'] {'environment': ..., 'not_if': 'ls /var/run/hive/hive-server.pid >/dev/null 2>&1 && ps `cat /var/run/hive/hive-server.pid` >/dev/null 2>&1', 'user': 'hive', 'path': ['/usr/lib/ambari-server/*:/sbin:/usr/sbin:/bin:/usr/bin:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin']}
>> 2015-05-04 10:47:31,693 - Execute['/usr/jdk64/jdk1.7.0_67/bin/java -cp /usr/lib/ambari-agent/DBConnectionVerification.jar:/usr/share/java/mysql-connector-java.jar org.apache.ambari.server.DBConnectionVerification 'jdbc:mysql://sv2lxbdp2mst04.corp.equinix.com/hive?createDatabaseIfNotExist=true' hive [PROTECTED] com.mysql.jdbc.Driver'] {'path': ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 5, 'try_sleep': 10}
>>
>>
>

Re: Ambari Unable to start Hive server 2 after enabling security

Posted by Shaik M <mu...@gmail.com>.
Hi,


*I am getting the following exception in the HiveServer2 log file:*

2015-05-04 13:45:23,542 WARN  [main]: ipc.Client (Client.java:run(676)) -
Exception encountered while connecting to the server :
javax.security.sasl.SaslException: GSS initiate failed [Caused by
GSSException: No valid credentials provided (Mechanism level: Failed to
find any Kerberos tgt)]
2015-05-04 13:45:23,543 INFO  [main]: retry.RetryInvocationHandler
(RetryInvocationHandler.java:invoke(140)) - Exception while invoking
getFileInfo of class ClientNamenodeProtocolTranslatorPB over
sv2lxbdp2mst05.corp.host.com/10.192.149.187:8020 after 8 fail over
attempts. Trying to fail over immediately.
java.io.IOException: Failed on local exception: java.io.IOException:
javax.security.sasl.SaslException: GSS initiate failed [Caused by
GSSException: No valid credentials provided (Mechanism level: Failed to
find any Kerberos tgt)];
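
For what it's worth, one common cause of this exact "Failed to find any Kerberos tgt" error on clusters with AES-256 keytabs is a JRE that is still running with the restricted JCE policy, so it may be worth confirming the unlimited-strength policy (the UnlimitedJCEPolicyJDK7.zip step in the log above) actually took effect in the JDK HiveServer2 uses. A quick generic check, run with that same JDK (the class name here is just an illustration, not part of the Ambari scripts):

```java
import javax.crypto.Cipher;

public class JcePolicyCheck {
    public static void main(String[] args) throws Exception {
        // AES-256 Kerberos keytabs require the unlimited-strength JCE
        // policy files; a restricted JRE caps AES at 128 bits.
        int maxAes = Cipher.getMaxAllowedKeyLength("AES");
        System.out.println("Max AES key length: " + maxAes);
        if (maxAes >= 256) {
            System.out.println("Unlimited JCE policy is in effect.");
        } else {
            System.out.println("Restricted policy: AES-256 keytabs will fail with GSS errors.");
        }
    }
}
```

If this prints a value below 256, re-unzipping the policy jars into jre/lib/security of the JDK Ambari configured (/usr/jdk64/jdk1.7.0_67 in the log) and restarting HiveServer2 would be the next step to try.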

Thanks,
Shaik

(If this is not the right place to post this query, please point me to the
correct group.)

On 4 May 2015 at 18:55, Shaik M <mu...@gmail.com> wrote:

> Hi,
>
> I am using Ambari 1.7 and HDP 2.2.4. I have enabled security in this
> cluster.
>
> After enabling the security Ambari unable start the Hive server 2.
>
> I have verified keytabs and it's working fine. Please find the below
> ambari string process log below. not shown any output in standerr window.
>
> please let me know the how to resolve this issue.
>
> Thanks,
> Shaik M
>
> stdout:   /var/lib/ambari-agent/data/output-1832.txt
>
> 2015-05-04 10:47:25,616 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 10     http://sv2lxbdp2mst05.corp.equinix.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
> 2015-05-04 10:47:25,646 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 10     http://sv2lxbdp2mst05.corp.equinix.com:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
> 2015-05-04 10:47:25,648 - Execute['rm -f local_policy.jar; rm -f US_export_policy.jar; unzip -o -j -q /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'path': ['/bin/', '/usr/bin'], 'only_if': 'test -e /usr/jdk64/jdk1.7.0_67/jre/lib/security && test -f /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'cwd': '/usr/jdk64/jdk1.7.0_67/jre/lib/security'}
> 2015-05-04 10:47:25,718 - Group['hadoop'] {'ignore_failures': False}
> 2015-05-04 10:47:25,720 - Modifying group hadoop
> 2015-05-04 10:47:25,784 - Group['nobody'] {'ignore_failures': False}
> 2015-05-04 10:47:25,785 - Modifying group nobody
> 2015-05-04 10:47:25,839 - Group['users'] {'ignore_failures': False}
> 2015-05-04 10:47:25,840 - Modifying group users
> 2015-05-04 10:47:25,886 - Group['nagios'] {'ignore_failures': False}
> 2015-05-04 10:47:25,887 - Modifying group nagios
> 2015-05-04 10:47:25,936 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
> 2015-05-04 10:47:25,937 - Modifying user nobody
> 2015-05-04 10:47:26,012 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
> 2015-05-04 10:47:26,013 - Modifying user oozie
> 2015-05-04 10:47:26,063 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-05-04 10:47:26,064 - Modifying user hive
> 2015-05-04 10:47:26,098 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-05-04 10:47:26,099 - Modifying user mapred
> 2015-05-04 10:47:26,131 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-05-04 10:47:26,132 - Modifying user nagios
> 2015-05-04 10:47:26,165 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
> 2015-05-04 10:47:26,166 - Modifying user ambari-qa
> 2015-05-04 10:47:26,199 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-05-04 10:47:26,199 - Modifying user zookeeper
> 2015-05-04 10:47:26,232 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
> 2015-05-04 10:47:26,233 - Modifying user tez
> 2015-05-04 10:47:26,265 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-05-04 10:47:26,266 - Modifying user hdfs
> 2015-05-04 10:47:26,298 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-05-04 10:47:26,299 - Modifying user sqoop
> 2015-05-04 10:47:26,332 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-05-04 10:47:26,332 - Modifying user hcat
> 2015-05-04 10:47:26,365 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-05-04 10:47:26,366 - Modifying user yarn
> 2015-05-04 10:47:26,399 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2015-05-04 10:47:26,401 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
> 2015-05-04 10:47:26,431 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if
> 2015-05-04 10:47:26,432 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
> 2015-05-04 10:47:26,433 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
> 2015-05-04 10:47:26,462 - Skipping Link['/etc/hadoop/conf'] due to not_if
> 2015-05-04 10:47:26,491 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'root'}
> 2015-05-04 10:47:26,516 - Execute['/bin/echo 0 > /selinux/enforce'] {'only_if': 'test -f /selinux/enforce'}
> 2015-05-04 10:47:26,577 - Directory['/var/log/hadoop'] {'owner': 'root', 'group': 'hadoop', 'mode': 0775, 'recursive': True}
> 2015-05-04 10:47:26,578 - Directory['/var/run/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True}
> 2015-05-04 10:47:26,579 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'recursive': True}
> 2015-05-04 10:47:26,590 - File['/etc/hadoop/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'root'}
> 2015-05-04 10:47:26,595 - File['/etc/hadoop/conf/health_check'] {'content': Template('health_check-v2.j2'), 'owner': 'root'}
> 2015-05-04 10:47:26,596 - File['/etc/hadoop/conf/log4j.properties'] {'content': '...', 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
> 2015-05-04 10:47:26,608 - File['/etc/hadoop/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
> 2015-05-04 10:47:26,609 - File['/etc/hadoop/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
> 2015-05-04 10:47:26,956 - Execute['kill `cat /var/run/hive/hive-server.pid` >/dev/null 2>&1 && rm -f /var/run/hive/hive-server.pid'] {'not_if': '! (ls /var/run/hive/hive-server.pid >/dev/null 2>&1 && ps `cat /var/run/hive/hive-server.pid` >/dev/null 2>&1)'}
> 2015-05-04 10:47:27,068 - HdfsDirectory['/apps/hive/warehouse'] {'security_enabled': True, 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '/usr/bin/kinit', 'mode': 0777, 'owner': 'hive', 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'action': ['create_delayed']}
> 2015-05-04 10:47:27,069 - HdfsDirectory['/user/hive'] {'security_enabled': True, 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '/usr/bin/kinit', 'mode': 0700, 'owner': 'hive', 'bin_dir': '/usr/hdp/current/hadoop-client/bin', 'action': ['create_delayed']}
> 2015-05-04 10:47:27,070 - HdfsDirectory['None'] {'security_enabled': True, 'keytab': '/etc/security/keytabs/hdfs.headless.keytab', 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '/usr/bin/kinit', 'action': ['create'], 'bin_dir': '/usr/hdp/current/hadoop-client/bin'}
> 2015-05-04 10:47:27,073 - Execute['/usr/bin/kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs'] {'user': 'hdfs'}
> [remainder of the log snipped; identical to the output quoted above]
>
>