Posted to general@hadoop.apache.org by Clement Jebakumar <je...@gmail.com> on 2010/11/05 02:47:57 UTC
Keeping Different folders for Different Hadoop Datanodes
Hello,
Is there a way to specify different folder paths to different datanode
instances in Hadoop?
I need to attach another hard disk to a PC and want to use it for Hadoop.
Is that possible?
*Clement Jebakumar,*
111/27 Keelamutharamman Kovil Street,
Tenkasi, 627 811
http://www.declum.com/clement.html
Re: Keeping Different folders for Different Hadoop Datanodes
Posted by Harsh J <qw...@gmail.com>.
Hi,
On Fri, Nov 5, 2010 at 7:17 AM, Clement Jebakumar <je...@gmail.com> wrote:
> Hello,
>
> Is there a way to specify different folder paths to different datanode
> instances in Hadoop?
The property "dfs.data.dir" can be a comma-separated list of paths,
allowing a single DataNode to use more than one mount point.
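For example, assuming the two disks are mounted at /disk1 and /disk2 (placeholder paths, substitute your own), the entry in hdfs-site.xml might look like:

```xml
<!-- hdfs-site.xml: one DataNode writing blocks to two mount points.
     The directory paths below are illustrative placeholders. -->
<property>
  <name>dfs.data.dir</name>
  <value>/disk1/hdfs/data,/disk2/hdfs/data</value>
</property>
```

The DataNode will then spread newly written blocks across the listed directories.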
To run multiple DataNode instances on a single machine (which
isn't a good idea in production), see this discussion:
http://search-hadoop.com/m/sApJY1zWgQV
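If you do experiment with that, each extra instance needs its own storage directory and its own ports so it doesn't collide with the first. A rough sketch of the second instance's hdfs-site.xml overrides (the directory and port numbers here are just illustrative, not defaults you must use):

```xml
<!-- Second DataNode instance on the same host: distinct dfs.data.dir
     and non-default ports. All values below are illustrative. -->
<property>
  <name>dfs.data.dir</name>
  <value>/disk2/hdfs/data2</value>
</property>
<property>
  <name>dfs.datanode.address</name>
  <value>0.0.0.0:50110</value>
</property>
<property>
  <name>dfs.datanode.http.address</name>
  <value>0.0.0.0:50175</value>
</property>
<property>
  <name>dfs.datanode.ipc.address</name>
  <value>0.0.0.0:50120</value>
</property>
```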
> I need to attach another hard disk to a PC and want to use it for Hadoop.
> Is that possible?
Yes, just add the new mount point to the "dfs.data.dir" property and
restart your DataNode. Note that the DataNode will not automatically
move existing blocks onto the new disk; to balance data between the
disks, copy a selection of files already residing on your HDFS (the
rewritten blocks will be spread across both disks) and delete the
older copies afterwards.
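That copy-and-delete step could look something like the following (the paths are made up, and this is only a sketch, not a tested recipe; verify on non-critical data first):

```shell
# Copy a directory so its blocks are rewritten and spread across
# both configured data directories, then swap it in for the original.
hadoop fs -cp /user/clement/data /user/clement/data.rebalanced
hadoop fs -rmr /user/clement/data
hadoop fs -mv /user/clement/data.rebalanced /user/clement/data
```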
>
> *Clement Jebakumar,*
> 111/27 Keelamutharamman Kovil Street,
> Tenkasi, 627 811
> http://www.declum.com/clement.html
>
--
Harsh J
www.harshj.com