Posted to dev@ambari.apache.org by Di Li <di...@ca.ibm.com> on 2015/08/10 22:48:02 UTC
Re: Review Request 36519: AMBARI-12349: Datanode failed to start when using non-default dfs.datanode.data.dir.mount.file or net.topology.script.file.name
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36519/
-----------------------------------------------------------
(Updated Aug. 10, 2015, 8:48 p.m.)
Review request for Ambari and Alejandro Fernandez.
Bugs: AMBARI-12349
https://issues.apache.org/jira/browse/AMBARI-12349
Repository: ambari
Description
-------
When configuring Hadoop from the install wizard, setting "File that stores mount point" or "net.topology.script.file.name" to a non-default location, e.g.
/etc/hadoop1/conf/dfs_data_dir_mount.hist
/etc/hadoop1/conf/topology_script.py
causes the install to fail because the DataNode cannot be started, with the error message
Applying File['/etc/hadoop1/conf/topology_mappings.data'] failed, parent directory /etc/hadoop1/conf doesn't exist
This is because rack_awareness.py only handles the creation of the file itself, not its parent directory. The default value doesn't have this problem because the directory /etc/hadoop/conf is created during yum install.
Diffs (updated)
-----
ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/rack_awareness.py 6b0bff6
ambari-server/src/test/python/stacks/2.0.6/hooks/before-START/test_before_start.py df39d2f
Diff: https://reviews.apache.org/r/36519/diff/
Testing
-------
unit test: test_hook_refresh_topology_custom_directories (test_before_start.TestHookBeforeStart) ... ok
Thanks,
Di Li
Re: Review Request 36519: AMBARI-12349: Datanode failed to start when using non-default dfs.datanode.data.dir.mount.file or net.topology.script.file.name
Posted by Di Li <di...@ca.ibm.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36519/
-----------------------------------------------------------
(Updated Aug. 11, 2015, 2:31 p.m.)
Review request for Ambari and Alejandro Fernandez.
Changes
-------
Rebased to be on top of the latest code in trunk.
Bugs: AMBARI-12349
https://issues.apache.org/jira/browse/AMBARI-12349
Repository: ambari
Description
-------
When configuring Hadoop from the install wizard, setting "File that stores mount point" or "net.topology.script.file.name" to a non-default location, e.g.
/etc/hadoop1/conf/dfs_data_dir_mount.hist
/etc/hadoop1/conf/topology_script.py
causes the install to fail because the DataNode cannot be started, with the error message
Applying File['/etc/hadoop1/conf/topology_mappings.data'] failed, parent directory /etc/hadoop1/conf doesn't exist
This is because rack_awareness.py only handles the creation of the file itself, not its parent directory. The default value doesn't have this problem because the directory /etc/hadoop/conf is created during yum install.
Diffs (updated)
-----
ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/rack_awareness.py 0b18ecb
ambari-server/src/test/python/stacks/2.0.6/hooks/before-START/test_before_start.py 00022ae
Diff: https://reviews.apache.org/r/36519/diff/
Testing
-------
unit test: test_hook_refresh_topology_custom_directories (test_before_start.TestHookBeforeStart) ... ok
Thanks,
Di Li
Re: Review Request 36519: AMBARI-12349: Datanode failed to start when using non-default dfs.datanode.data.dir.mount.file or net.topology.script.file.name
Posted by Di Li <di...@ca.ibm.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36519/
-----------------------------------------------------------
(Updated Aug. 10, 2015, 9:52 p.m.)
Review request for Ambari and Alejandro Fernandez.
Changes
-------
renamed variable "dir" to "parent_dir"
Bugs: AMBARI-12349
https://issues.apache.org/jira/browse/AMBARI-12349
Repository: ambari
Description
-------
When configuring Hadoop from the install wizard, setting "File that stores mount point" or "net.topology.script.file.name" to a non-default location, e.g.
/etc/hadoop1/conf/dfs_data_dir_mount.hist
/etc/hadoop1/conf/topology_script.py
causes the install to fail because the DataNode cannot be started, with the error message
Applying File['/etc/hadoop1/conf/topology_mappings.data'] failed, parent directory /etc/hadoop1/conf doesn't exist
This is because rack_awareness.py only handles the creation of the file itself, not its parent directory. The default value doesn't have this problem because the directory /etc/hadoop/conf is created during yum install.
Diffs (updated)
-----
ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/rack_awareness.py 6b0bff6
ambari-server/src/test/python/stacks/2.0.6/hooks/before-START/test_before_start.py df39d2f
Diff: https://reviews.apache.org/r/36519/diff/
Testing
-------
unit test: test_hook_refresh_topology_custom_directories (test_before_start.TestHookBeforeStart) ... ok
Thanks,
Di Li
Re: Review Request 36519: AMBARI-12349: Datanode failed to start when using non-default dfs.datanode.data.dir.mount.file or net.topology.script.file.name
Posted by Di Li <di...@ca.ibm.com>.
> On Aug. 10, 2015, 9:28 p.m., Alejandro Fernandez wrote:
> > ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/rack_awareness.py, line 28
> > <https://reviews.apache.org/r/36519/diff/2/?file=1036614#file1036614line28>
> >
> > Shouldn't use "dir" as a variable name since it's already a function name.
>
> Alejandro Fernandez wrote:
> Thanks, is this meant for trunk and 2.1.1?
>
> Alejandro Fernandez wrote:
> I tried on trunk, and this will have to be rebased.
>
> Di Li wrote:
> Hello Alejandro,
>
> Sorry for the inconvenience. Yes, this one is meant for trunk and 2.1.1. I was on leave for 2.5 weeks so my code base for the diff is a bit old.
>
> Alejandro Fernandez wrote:
> Di, can you rebase so I can try again on trunk?
> Is this a blocker for branch 2.1.1 (which was already cut) or can it go into release 2.1.2?
>
> Di Li wrote:
> Hello Alejandro,
>
> yes, I just rebased it to be on top of the latest code in trunk. New patch "AMBARI-12349-rebased.patch" uploaded.
> As for the branches, it's not a blocker, it can just go to the trunk and the 2.1.2 release.
>
> Thanks.
Hello Alejandro,
I rebased the fix yesterday, per your request. Could you please take a look and see whether you can push it to trunk now?
It can go into trunk and the 2.1.2 release; there is no need for it to go into 2.1.1.
Thank you.
- Di
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36519/#review94800
-----------------------------------------------------------
Re: Review Request 36519: AMBARI-12349: Datanode failed to start when using non-default dfs.datanode.data.dir.mount.file or net.topology.script.file.name
Posted by Di Li <di...@ca.ibm.com>.
> On Aug. 10, 2015, 9:28 p.m., Alejandro Fernandez wrote:
> > ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/rack_awareness.py, line 28
> > <https://reviews.apache.org/r/36519/diff/2/?file=1036614#file1036614line28>
> >
> > Shouldn't use "dir" as a variable name since it's already a function name.
>
> Alejandro Fernandez wrote:
> Thanks, is this meant for trunk and 2.1.1?
>
> Alejandro Fernandez wrote:
> I tried on trunk, and this will have to be rebased.
Hello Alejandro,
Sorry for the inconvenience. Yes, this one is meant for trunk and 2.1.1. I was on leave for 2.5 weeks so my code base for the diff is a bit old.
- Di
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36519/#review94800
-----------------------------------------------------------
Re: Review Request 36519: AMBARI-12349: Datanode failed to start when using non-default dfs.datanode.data.dir.mount.file or net.topology.script.file.name
Posted by Di Li <di...@ca.ibm.com>.
> On Aug. 10, 2015, 9:28 p.m., Alejandro Fernandez wrote:
> > ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/rack_awareness.py, line 28
> > <https://reviews.apache.org/r/36519/diff/2/?file=1036614#file1036614line28>
> >
> > Shouldn't use "dir" as a variable name since it's already a function name.
>
> Alejandro Fernandez wrote:
> Thanks, is this meant for trunk and 2.1.1?
>
> Alejandro Fernandez wrote:
> I tried on trunk, and this will have to be rebased.
>
> Di Li wrote:
> Hello Alejandro,
>
> Sorry for the inconvenience. Yes, this one is meant for trunk and 2.1.1. I was on leave for 2.5 weeks so my code base for the diff is a bit old.
>
> Alejandro Fernandez wrote:
> Di, can you rebase so I can try again on trunk?
> Is this a blocker for branch 2.1.1 (which was already cut) or can it go into release 2.1.2?
Hello Alejandro,
Yes, I just rebased it on top of the latest code in trunk. The new patch "AMBARI-12349-rebased.patch" has been uploaded.
As for the branches, it's not a blocker; it can go into trunk and the 2.1.2 release.
Thanks.
- Di
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36519/#review94800
-----------------------------------------------------------
Re: Review Request 36519: AMBARI-12349: Datanode failed to start when using non-default dfs.datanode.data.dir.mount.file or net.topology.script.file.name
Posted by Alejandro Fernandez <af...@hortonworks.com>.
> On Aug. 10, 2015, 9:28 p.m., Alejandro Fernandez wrote:
> > ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/rack_awareness.py, line 28
> > <https://reviews.apache.org/r/36519/diff/2/?file=1036614#file1036614line28>
> >
> > Shouldn't use "dir" as a variable name since it's already a function name.
>
> Alejandro Fernandez wrote:
> Thanks, is this meant for trunk and 2.1.1?
I tried on trunk, and this will have to be rebased.
- Alejandro
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36519/#review94800
-----------------------------------------------------------
Re: Review Request 36519: AMBARI-12349: Datanode failed to start when using non-default dfs.datanode.data.dir.mount.file or net.topology.script.file.name
Posted by Alejandro Fernandez <af...@hortonworks.com>.
> On Aug. 10, 2015, 9:28 p.m., Alejandro Fernandez wrote:
> > ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/rack_awareness.py, line 28
> > <https://reviews.apache.org/r/36519/diff/2/?file=1036614#file1036614line28>
> >
> > Shouldn't use "dir" as a variable name since it's already a function name.
Thanks, is this meant for trunk and 2.1.1?
- Alejandro
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36519/#review94800
-----------------------------------------------------------
Re: Review Request 36519: AMBARI-12349: Datanode failed to start when using non-default dfs.datanode.data.dir.mount.file or net.topology.script.file.name
Posted by Alejandro Fernandez <af...@hortonworks.com>.
> On Aug. 10, 2015, 9:28 p.m., Alejandro Fernandez wrote:
> > ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/rack_awareness.py, line 28
> > <https://reviews.apache.org/r/36519/diff/2/?file=1036614#file1036614line28>
> >
> > Shouldn't use "dir" as a variable name since it's already a function name.
>
> Alejandro Fernandez wrote:
> Thanks, is this meant for trunk and 2.1.1?
>
> Alejandro Fernandez wrote:
> I tried on trunk, and this will have to be rebased.
>
> Di Li wrote:
> Hello Alejandro,
>
> Sorry for the inconvenience. Yes, this one is meant for trunk and 2.1.1. I was on leave for 2.5 weeks so my code base for the diff is a bit old.
Di, can you rebase so I can try again on trunk?
Is this a blocker for branch 2.1.1 (which was already cut) or can it go into release 2.1.2?
- Alejandro
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36519/#review94800
-----------------------------------------------------------
Re: Review Request 36519: AMBARI-12349: Datanode failed to start when using non-default dfs.datanode.data.dir.mount.file or net.topology.script.file.name
Posted by Alejandro Fernandez <af...@hortonworks.com>.
> On Aug. 10, 2015, 9:28 p.m., Alejandro Fernandez wrote:
> > ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/rack_awareness.py, line 28
> > <https://reviews.apache.org/r/36519/diff/2/?file=1036614#file1036614line28>
> >
> > Shouldn't use "dir" as a variable name since it's already a function name.
>
> Alejandro Fernandez wrote:
> Thanks, is this meant for trunk and 2.1.1?
>
> Alejandro Fernandez wrote:
> I tried on trunk, and this will have to be rebased.
>
> Di Li wrote:
> Hello Alejandro,
>
> Sorry for the inconvenience. Yes, this one is meant for trunk and 2.1.1. I was on leave for 2.5 weeks so my code base for the diff is a bit old.
>
> Alejandro Fernandez wrote:
> Di, can you rebase so I can try again on trunk?
> Is this a blocker for branch 2.1.1 (which was already cut) or can it go into release 2.1.2?
>
> Di Li wrote:
> Hello Alejandro,
>
> yes, I just rebased it to be on top of the latest code in trunk. New patch "AMBARI-12349-rebased.patch" uploaded.
> As for the branches, it's not a blocker, it can just go to the trunk and the 2.1.2 release.
>
> Thanks.
>
> Di Li wrote:
> Hello Alejandro,
>
> I rebased the fix yesterday, per your request. Could you please take a look and see whether you can push it to trunk now?
> It can go into trunk and the 2.1.2 release; there is no need for it to go into 2.1.1.
>
> Thank you.
Will commit right now. Thanks.
- Alejandro
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36519/#review94800
-----------------------------------------------------------
Re: Review Request 36519: AMBARI-12349: Datanode failed to start when using non-default dfs.datanode.data.dir.mount.file or net.topology.script.file.name
Posted by Alejandro Fernandez <af...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/36519/#review94800
-----------------------------------------------------------
Ship it!
ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/rack_awareness.py (line 28)
<https://reviews.apache.org/r/36519/#comment149417>
Shouldn't use "dir" as a variable name since it's already a function name.
- Alejandro Fernandez
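The point behind the review comment is that assigning to "dir" rebinds Python's builtin of the same name in that scope, so a later dir() call there fails. A quick hypothetical illustration (not from the patch):

```python
# Rebinding the builtin name shadows it in this scope.
dir = "/etc/hadoop1/conf"  # shadows the builtin dir() function

try:
    dir()  # "dir" is now a str, so calling it raises TypeError
except TypeError as err:
    message = str(err)  # e.g. "'str' object is not callable"

# A non-shadowing name, as the review suggests:
parent_dir = "/etc/hadoop1/conf"
```

Renaming the variable to parent_dir, as done in the updated diff, avoids the shadowing entirely.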