Posted to mapreduce-user@hadoop.apache.org by Margusja <ma...@roo.ee> on 2014/10/14 15:54:23 UTC

The filesystem under path '/' has n CORRUPT files

Hi

I am playing with the hadoop-2 filesystem. I have two namenodes with HA and 
six datanodes.
I tried different configurations, killed namenodes, and so on...
Now I have a situation where most of my data is there, but some corrupted 
blocks exist.
hdfs fsck / gives me loads of under-replicated blocks. Will they 
recover? My replication factor is 3.
Filesystem Status: HEALTHY
Via the Web UI I see many "missing blocks" messages.
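
In case the under-replicated blocks do not recover on their own, I was 
thinking of something like the following to check datanode health and 
re-assert the replication factor (just a sketch, I have not run this yet; 
the path is only an example from my cluster):

```shell
# Summary of live datanodes, capacity, and block counts
hdfs dfsadmin -report

# Re-assert the replication factor on a subtree; -w waits until each
# block actually reaches the target replication, which can take a while.
hdfs dfs -setrep -w 3 /user/hue/cdr
```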

hdfs fsck / -list-corruptfileblocks gives me many corrupted blocks.
For example: blk_1073745897  /user/hue/cdr/2014/12/10/table10.csv
[hdfs@bigdata1 dfs]$ hdfs fsck /user/hue/cdr/2014/12/10/table10.csv 
-files -locations -blocks
Connecting to namenode via http://namenode1:50070
FSCK started by hdfs (auth:SIMPLE) from /192.168.81.108 for path 
/user/hue/cdr/2014/12/10/table10.csv at Tue Oct 14 16:51:55 EEST 2014
Path '/user/hue/cdr/2014/12/10/table10.csv' does not exist

As I understand it, there is nothing left to recover.
I tried to delete it:
[hdfs@bigdata1 dfs]$ hdfs dfs -rm /user/hue/cdr/2014/12/10/table10.csv
rm: `/user/hue/cdr/2014/12/10/table10.csv': No such file or directory
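
If these corrupt files are really unrecoverable, I assume the cleanup 
should go through fsck rather than dfs -rm, something like this (a sketch, 
I have not run the destructive steps yet):

```shell
# List what fsck still considers corrupt
hdfs fsck / -list-corruptfileblocks

# Either salvage the remaining blocks of corrupt files into /lost+found...
hdfs fsck / -move

# ...or drop the corrupt files from the namespace outright.
# Destructive -- only after giving the namenode time to re-replicate.
hdfs fsck / -delete
</imports>
```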

So what should I do?

-- 
Best regards, Margus (Margusja) Roo
+372 51 48 780
http://margus.roo.ee
http://ee.linkedin.com/in/margusroo
skype: margusja
ldapsearch -x -h ldap.sk.ee -b c=EE "(serialNumber=37303140314)"