Posted to dev@ambari.apache.org by "Robert Levas (JIRA)" <ji...@apache.org> on 2014/11/29 00:33:12 UTC

[jira] [Updated] (AMBARI-8477) HDFS service components should indicate security state

     [ https://issues.apache.org/jira/browse/AMBARI-8477?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Robert Levas updated AMBARI-8477:
---------------------------------
    Description: 
The HDFS service components should indicate security state when queried by the Ambari Agent via the STATUS_COMMAND.  Each component should determine its state as follows:

h3. NAMENODE
h4. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled'] 
*** = “true”
* Configuration File: /etc/hadoop/conf/core-site.xml
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: /etc/hadoop/conf/hdfs-site.xml
** dfs.namenode.keytab.file
*** not empty
*** path exists and is readable
*** required
** dfs.namenode.kerberos.principal
*** not empty
*** required
** dfs.namenode.kerberos.https.principal
*** not empty
*** required

h4. Pseudocode
{code}
if the indicators imply security is on and all validate
    if kinit(namenode principal) && kinit(https principal) succeed
        state = SECURED_KERBEROS
    else
        state = ERROR
else
    state = UNSECURED
{code}
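The indicator checks and the {{kinit}} probe above could be sketched in Python (the language of Ambari agent scripts). This is a minimal illustration only: the helper names, the dictionary shapes, and driving {{kinit}} through {{subprocess}} are assumptions for the sketch, not the actual Ambari implementation.

```python
import os
import subprocess

def indicators_validate(command_json, core_site, hdfs_site):
    """Return True only if every required NAMENODE indicator checks out."""
    if command_json.get('configurations', {}) \
                   .get('cluster-env', {}) \
                   .get('security_enabled') != 'true':
        return False
    if core_site.get('hadoop.security.authentication') != 'kerberos':
        return False
    if core_site.get('hadoop.security.authorization') != 'true':
        return False
    if core_site.get('hadoop.rpc.protection') != 'authentication':
        return False
    if not core_site.get('hadoop.security.auth_to_local'):
        return False
    keytab = hdfs_site.get('dfs.namenode.keytab.file')
    if not keytab or not os.access(keytab, os.R_OK):
        return False  # keytab path must exist and be readable
    return all(hdfs_site.get(p) for p in
               ('dfs.namenode.kerberos.principal',
                'dfs.namenode.kerberos.https.principal'))

def kinit_succeeds(keytab, principal):
    """Probe a keytab/principal pair; kinit exit code 0 means success."""
    return subprocess.call(['kinit', '-kt', keytab, principal]) == 0

def namenode_security_state(command_json, core_site, hdfs_site):
    if not indicators_validate(command_json, core_site, hdfs_site):
        return 'UNSECURED'
    keytab = hdfs_site['dfs.namenode.keytab.file']
    if (kinit_succeeds(keytab, hdfs_site['dfs.namenode.kerberos.principal'])
            and kinit_succeeds(keytab,
                               hdfs_site['dfs.namenode.kerberos.https.principal'])):
        return 'SECURED_KERBEROS'
    return 'ERROR'
```

The same shape applies to the other components below; only the hdfs-site property names and the principals passed to {{kinit}} change.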

h3. DATANODE
h4. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled'] 
*** = “true”
* Configuration File: /etc/hadoop/conf/core-site.xml
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: /etc/hadoop/conf/hdfs-site.xml
** dfs.datanode.keytab.file
*** not empty
*** path exists and is readable
*** required
** dfs.datanode.kerberos.principal
*** not empty
*** required
** dfs.datanode.kerberos.https.principal
*** not empty
*** required

h4. Pseudocode
{code}
if the indicators imply security is on and all validate
    if kinit(datanode principal) && kinit(https principal) succeed
        state = SECURED_KERBEROS
    else
        state = ERROR
else
    state = UNSECURED
{code}

h3. SECONDARY_NAMENODE
h4. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled'] 
*** = “true”
* Configuration File: /etc/hadoop/conf/core-site.xml
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: /etc/hadoop/conf/hdfs-site.xml
** dfs.namenode.secondary.keytab.file
*** not empty
*** path exists and is readable
*** required
** dfs.namenode.secondary.kerberos.principal
*** not empty
*** required
** dfs.namenode.secondary.kerberos.https.principal
*** not empty
*** required

h4. Pseudocode
{code}
if the indicators imply security is on and all validate
    if kinit(secondary namenode principal) && kinit(https principal) succeed
        state = SECURED_KERBEROS
    else
        state = ERROR
else
    state = UNSECURED
{code}

h3. HDFS_CLIENT
h4. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled'] 
*** = “true”
* Configuration File: /etc/hadoop/conf/core-site.xml
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: /etc/hadoop/conf/hdfs-site.xml
** dfs.web.authentication.kerberos.keytab
*** not empty
*** path exists and is readable
*** required
** dfs.web.authentication.kerberos.principal
*** not empty
*** required

h4. Pseudocode
{code}
if the indicators imply security is on and all validate
    if kinit(hdfs web principal) succeeds
        state = SECURED_KERBEROS
    else
        state = ERROR
else
    state = UNSECURED
{code}

h3. JOURNALNODE
h4. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled'] 
*** = “true”
* Configuration File: /etc/hadoop/conf/core-site.xml
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required

h4. Pseudocode
{code}
if the indicators imply security is on and all validate
    state = SECURED_KERBEROS
else
    state = UNSECURED
{code}

h3. ZKFC
h4. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled'] 
*** = “true”
* Configuration File: /etc/hadoop/conf/core-site.xml
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required

h4. Pseudocode
{code}
if the indicators imply security is on and all validate
    state = SECURED_KERBEROS
else
    state = UNSECURED
{code}

_*Note*_: Due to the _cost_ of calling {{kinit}}, results should be cached for a period of time before retrying.  This caching may be an issue depending on the frequency of the heartbeat timeout.

> HDFS service components should indicate security state
> ------------------------------------------------------
>
>                 Key: AMBARI-8477
>                 URL: https://issues.apache.org/jira/browse/AMBARI-8477
>             Project: Ambari
>          Issue Type: Improvement
>          Components: ambari-server, stacks
>    Affects Versions: 2.0.0
>            Reporter: Robert Levas
>            Assignee: Robert Levas
>              Labels: agent, kerberos, lifecycle, security
>             Fix For: 2.0.0
>
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)