Posted to dev@ambari.apache.org by "Andrew Onischuk (JIRA)" <ji...@apache.org> on 2015/03/23 12:30:10 UTC
[jira] [Resolved] (AMBARI-10154) Hive Ambari install is missing a
limits.d/hive
[ https://issues.apache.org/jira/browse/AMBARI-10154?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Andrew Onischuk resolved AMBARI-10154.
--------------------------------------
Resolution: Fixed
Committed to trunk
> Hive Ambari install is missing a limits.d/hive
> ----------------------------------------------
>
> Key: AMBARI-10154
> URL: https://issues.apache.org/jira/browse/AMBARI-10154
> Project: Ambari
> Issue Type: Bug
> Reporter: Andrew Onischuk
> Assignee: Andrew Onischuk
> Fix For: 2.1.0
>
>
> Ambari installs are missing the Linux configuration files that set the open-
> files/max-processes limits for HiveServer2.
> In comparison, HDFS ships the required file
> /etc/security/limits.d/hdfs.conf
> which contains the ulimit exceptions for nofile/nproc variables (similar to
> the ones listed in this doc item)
> <https://ambari.apache.org/1.2.0/installing-hadoop-using-ambari/content/ambari-chap5-3-1.html>
> An equivalent /etc/security/limits.d/hive.conf is required for HiveServer2 to
> avoid similar issues.
> This is purely an installer issue, with no corresponding Hive changes in
> Apache.
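For reference, a minimal sketch of what such a file could contain, following the pam_limits format used by the hdfs.conf file mentioned above (the user name, limit values, and file path here are illustrative, not the committed Ambari defaults):

```
# /etc/security/limits.d/hive.conf (illustrative values)
# <domain>  <type>  <item>   <value>
hive        -       nofile   32768
hive        -       nproc    65536
```

The "-" type applies the value as both the soft and hard limit; nofile caps open file descriptors and nproc caps the number of processes/threads for the hive user running HiveServer2.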
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)