Posted to common-user@hadoop.apache.org by Sudhir Vallamkondu <Su...@icrossing.com> on 2010/10/04 18:28:33 UTC

Re: Total Space Available on Hadoop Cluster Or Hadoop version of "df"

hadoop fs -du has an -h option for human-readable values, but it doesn't seem
to work here. Instead you can use something like this to print sizes in
gigabytes; adjust the 1024 multipliers for other units.

hadoop fs -du / | awk '{print ($1/(1024*1024*1024))"g" "\t" $2}'
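
If you are after total cluster capacity rather than per-path usage (the
closer analogue of the Unix "df"), the dfsadmin report works on 0.20 as well.
A minimal sketch; the exact field labels below are from memory and may vary
between releases:

hadoop dfsadmin -report

The first few lines summarize the whole cluster (Configured Capacity, DFS
Used, DFS Remaining), followed by a per-datanode breakdown. To pull out just
the cluster-wide free space, something like this should work (head -1 keeps
the summary line and drops the matching per-node lines):

hadoop dfsadmin -report | grep 'DFS Remaining:' | head -1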



On 10/4/10 2:04 AM, "common-user-digest-help@hadoop.apache.org"
<co...@hadoop.apache.org> wrote:

> From: Sandhya E <sa...@gmail.com>
> Date: Sat, 2 Oct 2010 23:36:55 +0530
> To: <co...@hadoop.apache.org>
> Subject: Re: Total Space Available on Hadoop Cluster Or Hadoop version of
> "df".
> 
> There is a fs -du command that can be useful. Or the Hadoop DFS
> web UI also shows the stats.
> 
> On Sat, Oct 2, 2010 at 9:44 AM, rahul <rm...@apple.com> wrote:
>> Hi,
>> 
>> I am using Hadoop version 0.20.2 for data processing, with a Hadoop
>> cluster set up on two nodes.
>> 
>> And I am continuously adding more space to the nodes.
>> 
>> Can somebody let me know how to get the total space available on the Hadoop
>> cluster using the command line,
>> 
>>  or
>> 
>> a Hadoop equivalent of the Unix "df" command?
>> 
>> Any input is helpful.
>> 
>> Thanks
>> Rahul
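
For what it's worth, later Hadoop releases added a df subcommand to FsShell
that answers this directly; it is not available in 0.20.2, but on newer
clusters the following should print total, used, and available DFS space:

hadoop fs -df -h /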

