Posted to user@ambari.apache.org by "lijian@cnic.cn" <li...@cnic.cn> on 2015/07/02 05:02:51 UTC

Add new directories to DataNode with Ambari, thanks

Hi,
I am having trouble adding new directories to a DataNode with Ambari.
After adding a new directory to the DataNode configuration file and restarting services, I uploaded a lot of data to HDFS. However, the new directory is not used, and the old directory is full.
I tried modifying the "Reserved space for HDFS" parameter, but it had no effect at all.
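For reference, "adding a new directory" here means extending the comma-separated dfs.datanode.data.dir property in hdfs-site.xml. A minimal sketch, with placeholder paths (/data1 standing in for the full disk, /data2 for the newly added one):

```xml
<!-- hdfs-site.xml: example paths, not the actual cluster layout -->
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/data1/hdfs/data,/data2/hdfs/data</value>
</property>
```

Note that by default the DataNode only places new block writes round-robin across its volumes; existing blocks are never migrated to a newly added directory.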
I also found another tip in the Hadoop FAQ:
3.12. On an individual data node, how do you balance the blocks on the disk?
Hadoop currently does not have a method by which to do this automatically. To do this manually:
Take down the HDFS
Use the UNIX mv command to move the individual blocks and meta pairs from one directory to another on each host
Restart the HDFS
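The FAQ's manual steps could be sketched roughly as the shell snippet below. This is only an illustration under assumptions: move_block_pairs is a hypothetical helper (not an HDFS tool), the paths are placeholders for entries under dfs.datanode.data.dir, and HDFS must be fully stopped on the node before any files are moved.

```shell
#!/bin/sh
# Hedged sketch of the FAQ procedure: move each block file together with
# its matching .meta file from one data directory to another.
move_block_pairs() {
  src=$1
  dst=$2
  for meta in "$src"/blk_*.meta; do
    [ -e "$meta" ] || continue           # nothing to move
    blk="${meta%_*.meta}"                # blk_<id>_<genstamp>.meta -> blk_<id>
    mv "$blk" "$meta" "$dst"/
  done
}

# Usage (ONLY with HDFS stopped on this node, e.g. via Ambari):
#   move_block_pairs /data1/hdfs/data/current /data2/hdfs/data/current
```

After restarting HDFS, the DataNode re-scans its configured directories and reports the moved blocks from their new location.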
I want to know if there is any way to fix this problem more effectively.
Thanks a lot.



lijian@cnic.cn