Posted to common-user@hadoop.apache.org by adamphelps <am...@opendns.com> on 2010/10/12 01:27:01 UTC
Finding replicants of an HDFS file
Is there a command that will display which nodes the blocks of a file are
replicated to?
We're prototyping a hadoop cluster and want to perform some failure testing
where we kill the right combination of nodes to make a file inaccessible;
however, I haven't been able to track down a command that will do this.
Thanks
Re: Finding replicants of an HDFS file
Posted by Adam Phelps <am...@opendns.com>.
On 10/13/10 8:14 AM, Aaron Myers wrote:
> Hi Adam,
>
> On Mon, Oct 11, 2010 at 8:40 PM, adamphelps <am...@opendns.com> wrote:
>>
>> Is there a command that will display which nodes the blocks of a file are
>> replicated to?
>>
>
> Try the command:
>
> hadoop fsck <path> -locations -blocks -files
>
> That will print details (including locations) for every file under
> <path>. I'm pretty sure you can give it either a single file or a
> directory, in which case it will show you the details for every file
> under that directory.
>
> Hope that helps.
Yes, that does exactly what I needed. Thanks to you and the people who
responded off-list.
- Adam
Re: Finding replicants of an HDFS file
Posted by Aaron Myers <at...@cloudera.com>.
Hi Adam,
On Mon, Oct 11, 2010 at 8:40 PM, adamphelps <am...@opendns.com> wrote:
>
> Is there a command that will display which nodes the blocks of a file are
> replicated to?
>
Try the command:
hadoop fsck <path> -locations -blocks -files
That will print details (including locations) for every file under
<path>. I'm pretty sure you can give it either a single file or a
directory, in which case it will show you the details for every file
under that directory.
Hope that helps.
Aaron
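For reference, a minimal sketch of how the fsck report might be fed into a failure-test script. The path is a placeholder, and the bracketed "[host:port, ...]" per-block location format assumed here is what 0.20-era fsck printed; newer releases may format locations differently, so treat the parsing as illustrative.

```shell
# extract_replica_nodes: pull the datanode host:port entries out of a
# "hadoop fsck <path> -files -blocks -locations" report, one per line.
# Assumes IPv4 "ip:port" tokens in the block location lines (an
# assumption about the fsck output format, not a documented contract).
extract_replica_nodes() {
  grep -oE '[0-9]+(\.[0-9]+){3}:[0-9]+' | sort -u
}

# Typical use against a running cluster (path is hypothetical):
#   hadoop fsck /user/example/data.txt -files -blocks -locations \
#     | extract_replica_nodes
```

Killing every datanode in that list for a given block should then make the file unreadable, which is the failure scenario described above.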