Posted to dev@ambari.apache.org by Robert Levas <rl...@hortonworks.com> on 2015/04/17 20:32:14 UTC

Review Request 33316: NameNode Restart fails after attempt to Kerberize Cluster

-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/33316/
-----------------------------------------------------------

Review request for Ambari, Alejandro Fernandez, Andrew Onischuk, Jonathan Hurley, and Vitalyi Brodetskyi.


Bugs: AMBARI-10550
    https://issues.apache.org/jira/browse/AMBARI-10550


Repository: ambari


Description
-------

When attempting to restart the HDFS NameNode after running the Kerberos wizard to enable Kerberos, the NameNode fails to start up.

The underlying failure in the ambari-agent appears to be:
```
"Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 298, in <module>
    NameNode().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 72, in start
    namenode(action="start", rolling_restart=rolling_restart, env=env)
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_namenode.py", line 38, in namenode
    setup_ranger_hdfs()
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/setup_ranger_hdfs.py", line 66, in setup_ranger_hdfs
    hdfs_repo_data = hdfs_repo_properties()
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/setup_ranger_hdfs.py", line 194, in hdfs_repo_properties
    config_dict['dfs.datanode.kerberos.principal'] = params._dn_principal_name
AttributeError: 'module' object has no attribute '_dn_principal_name'
```

This keeps the HDFS NameNode from starting up properly after Kerberos is enabled, and it appears to prevent the Enable Kerberos process from completing.

The problem appears to be a Python coding issue where _private_ variables (declared with a leading underscore) are not imported from `common-services/HDFS/2.1.0.2.0/package/scripts/params_linux.py` into `common-services/HDFS/2.1.0.2.0/package/scripts/setup_ranger_hdfs.py`.

**Solution**
This is a basic Python coding issue where _private_ variables are not imported into the offending module - `common-services/HDFS/2.1.0.2.0/package/scripts/setup_ranger_hdfs.py`.  By removing the leading underscore, the variables become _public_ and can then be imported from `common-services/HDFS/2.1.0.2.0/package/scripts/params_linux.py` into `common-services/HDFS/2.1.0.2.0/package/scripts/setup_ranger_hdfs.py`.
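To make the failure mode concrete, here is a minimal, self-contained sketch; the file contents and values are hypothetical stand-ins, and it assumes `params.py` re-exports `params_linux.py` with a wildcard import, which is what the real scripts appear to do:

```
# underscore_reexport_demo.py -- hypothetical, minimal reproduction of the AttributeError.
# It builds throwaway stand-ins for params_linux.py and params.py in a temp directory.
import importlib
import os
import sys
import tempfile

workdir = tempfile.mkdtemp()
sys.path.insert(0, workdir)

# Stand-in for params_linux.py: one underscore-prefixed name, one plain name.
with open(os.path.join(workdir, "params_linux.py"), "w") as f:
    f.write("_dn_principal_name = 'dn/_HOST@EXAMPLE.COM'\n")
    f.write("dn_principal_name = 'dn/_HOST@EXAMPLE.COM'\n")

# Stand-in for params.py: re-exports params_linux with a wildcard import.
with open(os.path.join(workdir, "params.py"), "w") as f:
    f.write("from params_linux import *\n")

params = importlib.import_module("params")

# This is effectively what hdfs_repo_properties() does in setup_ranger_hdfs.py:
print(hasattr(params, "_dn_principal_name"))  # False -- wildcard imports skip "_" names
print(params.dn_principal_name)               # works once the underscore is dropped
```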

The following variables were renamed:

Offending Name     | Fixed Name
------------------ | -----------------
_dn_principal_name | dn_principal_name
_dn_keytab         | dn_keytab
_nn_principal_name | nn_principal_name
_nn_keytab         | nn_keytab
_jn_principal_name | jn_principal_name
_jn_keytab         | jn_keytab


Diffs
-----

  ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/params_linux.py b0e100f 
  ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/setup_ranger_hdfs.py 9413d8e 

Diff: https://reviews.apache.org/r/33316/diff/


Testing
-------

Manually tested in a cluster with HDFS and Ranger using HDP 2.3.


Thanks,

Robert Levas


Re: Review Request 33316: NameNode Restart fails after attempt to Kerberize Cluster

Posted by Jonathan Hurley <jh...@hortonworks.com>.

> On April 17, 2015, 5:22 p.m., Andrew Onischuk wrote:
> > There are no private variables in Python. Using a _ prefix is just a convention for names that are not intended to be used outside.
> > see http://stackoverflow.com/questions/1641219/does-python-have-private-variables-in-classes
> > 
> > I'm curious how that can fix the problem. Anyway, +1 if it does.
> 
> Robert Levas wrote:
>     Hey Andrew... I was hoping you would chime in on this; but I was looking for validation of my _theory_, not the contrary. ;) In any case, I agree with you, but the solution seems to fix the issue. Maybe there is an issue with underscore-prefixed variable names being imported from a module that previously imported them. If only one import were at play, I don't think we would have seen the issue.

I don't think that's quite right. _vars are not pulled in by a wildcard import unless specifically imported. __vars are obfuscated (name-mangled inside classes), essentially making them private. The convention is that _ marks names that are accessible but not automatically imported (protected), while __ is reserved for private / don't-touch.
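For illustration, a minimal sketch of the two conventions (the class and values below are made up):

```
# Illustrative only: a single underscore is a "keep out" convention that Python does
# not enforce; a double underscore triggers name mangling inside a class body.
class Service(object):
    def __init__(self):
        self._keytab = "/etc/security/keytabs/dn.service.keytab"  # protected by convention
        self.__secret = "mangled"                                  # stored as _Service__secret

svc = Service()
print(svc._keytab)           # accessible; the leading underscore is only a hint
print(svc._Service__secret)  # reachable, but only via the mangled name
# print(svc.__secret)        # would raise AttributeError outside the class
```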


- Jonathan


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/33316/#review80531
-----------------------------------------------------------




Re: Review Request 33316: NameNode Restart fails after attempt to Kerberize Cluster

Posted by Robert Levas <rl...@hortonworks.com>.

> On April 17, 2015, 5:22 p.m., Andrew Onischuk wrote:
> > There are no private variables in Python. Using a _ prefix is just a convention for names that are not intended to be used outside.
> > see http://stackoverflow.com/questions/1641219/does-python-have-private-variables-in-classes
> > 
> > I'm curious how that can fix the problem. Anyway, +1 if it does.
> 
> Robert Levas wrote:
>     Hey Andrew... I was hoping you would chime in on this; but I was looking for validation of my _theory_, not the contrary. ;) In any case, I agree with you, but the solution seems to fix the issue. Maybe there is an issue with underscore-prefixed variable names being imported from a module that previously imported them. If only one import were at play, I don't think we would have seen the issue.
> 
> Jonathan Hurley wrote:
>     I don't think that's quite right. _vars are not pulled in by a wildcard import unless specifically imported. __vars are obfuscated (name-mangled inside classes), essentially making them private. The convention is that _ marks names that are accessible but not automatically imported (protected), while __ is reserved for private / don't-touch.

Thanks Jonathan... I am glad you chimed in as well.  

I have noticed that _vars can be imported but are not _re-exported_. So if moduleA contains the variables _variableA and variableA, and moduleB imports moduleA, then moduleB can see/use both _variableA and variableA. However, if moduleC imports moduleB, it cannot see/use _variableA but can see/use variableA.
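A runnable sketch of that behavior, using the hypothetical module names above (the name is dropped at the wildcard-import step, since `from module import *` skips leading-underscore names unless they appear in `__all__`):

```
import sys
import types

# Stand-in for moduleA: one "private" and one "public" module-level variable.
moduleA = types.ModuleType("moduleA")
exec("_variableA = 'hidden'; variableA = 'visible'", moduleA.__dict__)
sys.modules["moduleA"] = moduleA

# moduleB imports both names explicitly, so it can see and use both of them.
moduleB = types.ModuleType("moduleB")
exec("from moduleA import _variableA, variableA", moduleB.__dict__)
sys.modules["moduleB"] = moduleB

# moduleC uses a wildcard import from moduleB: only the public name comes through.
moduleC = types.ModuleType("moduleC")
exec("from moduleB import *", moduleC.__dict__)

print(hasattr(moduleB, "_variableA"))  # True  -- explicitly imported into moduleB
print(hasattr(moduleC, "_variableA"))  # False -- dropped by "from moduleB import *"
print(hasattr(moduleC, "variableA"))   # True  -- public names survive the wildcard
```

Listing the underscore-prefixed names in an `__all__` declaration would be another way to make them survive the wildcard import, though renaming them, as this patch does, is simpler.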


- Robert


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/33316/#review80531
-----------------------------------------------------------




Re: Review Request 33316: NameNode Restart fails after attempt to Kerberize Cluster

Posted by Robert Levas <rl...@hortonworks.com>.

> On April 17, 2015, 5:22 p.m., Andrew Onischuk wrote:
> > There are no private variables in Python. Using a _ prefix is just a convention for names that are not intended to be used outside.
> > see http://stackoverflow.com/questions/1641219/does-python-have-private-variables-in-classes
> > 
> > I'm curious how that can fix the problem. Anyway, +1 if it does.

Hey Andrew... I was hoping you would chime in on this; but I was looking for validation of my _theory_, not the contrary. ;) In any case, I agree with you, but the solution seems to fix the issue. Maybe there is an issue with underscore-prefixed variable names being imported from a module that previously imported them. If only one import were at play, I don't think we would have seen the issue.


- Robert


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/33316/#review80531
-----------------------------------------------------------




Re: Review Request 33316: NameNode Restart fails after attempt to Kerberize Cluster

Posted by Andrew Onischuk <ao...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/33316/#review80531
-----------------------------------------------------------

Ship it!


There are no private variables in Python. Using a _ prefix is just a convention for names that are not intended to be used outside.
see http://stackoverflow.com/questions/1641219/does-python-have-private-variables-in-classes

I'm curious how that can fix the problem. Anyway, +1 if it does.

- Andrew Onischuk




Re: Review Request 33316: NameNode Restart fails after attempt to Kerberize Cluster

Posted by Jonathan Hurley <jh...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/33316/#review80565
-----------------------------------------------------------

Ship it!


Ship It!

- Jonathan Hurley




Re: Review Request 33316: NameNode Restart fails after attempt to Kerberize Cluster

Posted by Alejandro Fernandez <af...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/33316/#review80515
-----------------------------------------------------------

Ship it!


Ship It!

- Alejandro Fernandez




Re: Review Request 33316: NameNode Restart fails after attempt to Kerberize Cluster

Posted by Robert Levas <rl...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/33316/
-----------------------------------------------------------

(Updated April 17, 2015, 2:34 p.m.)


Review request for Ambari, Alejandro Fernandez, Andrew Onischuk, Jonathan Hurley, and Vitalyi Brodetskyi.


Bugs: AMBARI-10550
    https://issues.apache.org/jira/browse/AMBARI-10550


Repository: ambari


Description
-------

When attempting to restart the HDFS NameNode after running the Kerberos wizard to enable Kerberos, the NameNode fails to start up.

The underlying failure in the ambari-agent appears to be:
```
"Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 298, in <module>
    NameNode().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 72, in start
    namenode(action="start", rolling_restart=rolling_restart, env=env)
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_namenode.py", line 38, in namenode
    setup_ranger_hdfs()
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/setup_ranger_hdfs.py", line 66, in setup_ranger_hdfs
    hdfs_repo_data = hdfs_repo_properties()
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/setup_ranger_hdfs.py", line 194, in hdfs_repo_properties
    config_dict['dfs.datanode.kerberos.principal'] = params._dn_principal_name
AttributeError: 'module' object has no attribute '_dn_principal_name'
```

This keeps the HDFS NameNode from starting up properly after Kerberos is enabled, and it appears to prevent the Enable Kerberos process from completing.

The problem appears to be a Python coding issue where _private_ variables (declared with a leading underscore) are not imported from `common-services/HDFS/2.1.0.2.0/package/scripts/params_linux.py` into `common-services/HDFS/2.1.0.2.0/package/scripts/setup_ranger_hdfs.py`.

**Solution**
This is a basic Python coding issue where _private_ variables are not imported into the offending module - `common-services/HDFS/2.1.0.2.0/package/scripts/setup_ranger_hdfs.py`.  By removing the leading underscore, the variables become _public_ and can then be imported from `common-services/HDFS/2.1.0.2.0/package/scripts/params_linux.py` into `common-services/HDFS/2.1.0.2.0/package/scripts/setup_ranger_hdfs.py`.

The following variables were renamed:

Offending Name     | Fixed Name
------------------ | -----------------
_dn_principal_name | dn_principal_name
_dn_keytab         | dn_keytab
_nn_principal_name | nn_principal_name
_nn_keytab         | nn_keytab
_jn_principal_name | jn_principal_name
_jn_keytab         | jn_keytab


Diffs
-----

  ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/params_linux.py b0e100f 
  ambari-server/src/main/resources/common-services/HDFS/2.1.0.2.0/package/scripts/setup_ranger_hdfs.py 9413d8e 

Diff: https://reviews.apache.org/r/33316/diff/


Testing (updated)
-------

Manually tested in a cluster with HDFS and Ranger using HDP 2.3.

**Jenkins test results**
Failed due to a missing artifact.


Thanks,

Robert Levas