Posted to issues@hbase.apache.org by "Jonathan Hsieh (JIRA)" <ji...@apache.org> on 2012/08/29 21:14:08 UTC

[jira] [Created] (HBASE-6686) HFile Quaranatine fails with missing dirs in hadoop 2.0

Jonathan Hsieh created HBASE-6686:
-------------------------------------

             Summary: HFile Quaranatine fails with missing dirs in hadoop 2.0 
                 Key: HBASE-6686
                 URL: https://issues.apache.org/jira/browse/HBASE-6686
             Project: HBase
          Issue Type: Bug
          Components: hbck
    Affects Versions: 0.92.2, 0.96.0, 0.94.2
            Reporter: Jonathan Hsieh
             Fix For: 0.92.2, 0.96.0, 0.94.2


Two unit tests fail because listStatus()'s semantics changed between hadoop 1.0 and hadoop 2.0: hadoop 1.0 returns an empty array when called on a directory that does not exist, while hadoop 2.0 throws a FileNotFoundException.

Here's the exception:
{code}
2012-08-28 16:01:19,789 WARN  [Thread-3155] hbck.HFileCorruptionChecker(230): Failed to quaratine an HFile in regiondir hdfs://localhost:38096/user/jenkins/hbase/testQuarantineMissingFamdir/34b2e072b33052bf4875f85513e9c669
java.io.FileNotFoundException: File hdfs://localhost:38096/user/jenkins/hbase/testQuarantineMissingFamdir/34b2e072b33052bf4875f85513e9c669/fam does not exist.
	at org.apache.hadoop.hdfs.DistributedFileSystem.listStatus(DistributedFileSystem.java:406)
	at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1341)
	at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1381)
	at org.apache.hadoop.hbase.util.hbck.HFileCorruptionChecker.checkColFamDir(HFileCorruptionChecker.java:152)
	at org.apache.hadoop.hbase.util.TestHBaseFsck$2$1.checkColFamDir(TestHBaseFsck.java:1401)
	at org.apache.hadoop.hbase.util.hbck.HFileCorruptionChecker.checkRegionDir(HFileCorruptionChecker.java:185)
	at org.apache.hadoop.hbase.util.hbck.HFileCorruptionChecker$RegionDirChecker.call(HFileCorruptionChecker.java:267)
	at org.apache.hadoop.hbase.util.hbck.HFileCorruptionChecker$RegionDirChecker.call(HFileCorruptionChecker.java:258)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:98)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:206)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:662)
{code}
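
For illustration, here is a minimal defensive wrapper that restores the hadoop 1.0 behaviour on both versions. This is a hypothetical helper sketch (the class and method names are invented for this example), not the actual HBASE-6686 patch:

{code}
import java.io.FileNotFoundException;
import java.io.IOException;

import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical helper illustrating the semantic difference described above;
// not the actual HBASE-6686 patch.
public class FsUtils {
  /**
   * listStatus() with hadoop 1.0 semantics on both hadoop 1.0 and hadoop 2.0:
   * a missing directory yields an empty array instead of an exception.
   */
  public static FileStatus[] listStatusQuietly(FileSystem fs, Path dir) throws IOException {
    try {
      FileStatus[] statuses = fs.listStatus(dir);
      // Be defensive: guard against implementations that return null.
      return statuses == null ? new FileStatus[0] : statuses;
    } catch (FileNotFoundException fnfe) {
      // hadoop 2.0 throws FileNotFoundException when the directory does not exist.
      return new FileStatus[0];
    }
  }
}
{code}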


[jira] [Resolved] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Hsieh resolved HBASE-6686.
-----------------------------------

      Resolution: Fixed
    Hadoop Flags: Reviewed

Committed to the 0.92/0.94/0.96 branches. Thanks Lars and Jimmy for the reviews.
                

[jira] [Updated] (HBASE-6686) HFile Quaranatine fails with missing dirs in hadoop 2.0

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Hsieh updated HBASE-6686:
----------------------------------

    Attachment: hbase-6686-trunk.patch

Simple fix.
                

[jira] [Updated] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Hsieh updated HBASE-6686:
----------------------------------

    Attachment: hbase-6686-94-92.patch

The 94/92 version passes against both the hadoop 1 and hadoop 2 profiles.  The equivalent patch is failing against trunk on a different test.  Investigating.
                

[jira] [Updated] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Hsieh updated HBASE-6686:
----------------------------------

    Summary: HFile Quarantine fails with missing dirs in hadoop 2.0   (was: HFile Quaranatine fails with missing dirs in hadoop 2.0 )
    

[jira] [Commented] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13444347#comment-13444347 ] 

Jonathan Hsieh commented on HBASE-6686:
---------------------------------------

Ack, the first version worked on 0.92/0.94 but broke on trunk.
                

[jira] [Commented] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Hudson (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13448286#comment-13448286 ] 

Hudson commented on HBASE-6686:
-------------------------------

Integrated in HBase-0.94-security-on-Hadoop-23 #7 (See [https://builds.apache.org/job/HBase-0.94-security-on-Hadoop-23/7/])
    HBASE-6686 HFile Quarantine fails with missing dirs in hadoop 2.0 (Revision 1378763)

     Result = FAILURE
jmhsieh : 
Files : 
* /hbase/branches/0.94/src/main/java/org/apache/hadoop/hbase/util/hbck/HFileCorruptionChecker.java

                

[jira] [Commented] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Lars Hofhansl (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13444499#comment-13444499 ] 

Lars Hofhansl commented on HBASE-6686:
--------------------------------------

0.92/0.94 patch looks good.
                

[jira] [Commented] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13444412#comment-13444412 ] 

Jonathan Hsieh commented on HBASE-6686:
---------------------------------------

{code}
2012-08-29 12:55:26,031 ERROR [IPC Server handler 0 on 41070] security.UserGroupInformation(1235): PriviledgedActionException as:jon (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: /user/jon/hbase/testQuarantineMissingHFile/4332ea87d02d33e443550537722ff4fc/fam/befbe65ff30e4a46866f04a5671f0e44
2012-08-29 12:55:26,085 WARN  [Thread-2994] hbck.HFileCorruptionChecker(253): Failed to quaratine an HFile in regiondir hdfs://localhost:41070/user/jon/hbase/testQuarantineMissingHFile/4332ea87d02d33e443550537722ff4fc
java.lang.reflect.UndeclaredThrowableException
	at $Proxy23.getBlockLocations(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:882)
	at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:152)
	at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:119)
	at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:112)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:955)
	at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:212)
	at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:75)
	at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:664)
	at org.apache.hadoop.hbase.io.hfile.HFile.createReaderWithEncoding(HFile.java:575)
	at org.apache.hadoop.hbase.io.hfile.HFile.createReader(HFile.java:605)
	at org.apache.hadoop.hbase.util.hbck.HFileCorruptionChecker.checkHFile(HFileCorruptionChecker.java:94)
	at org.apache.hadoop.hbase.util.TestHBaseFsck$1$1.checkHFile(TestHBaseFsck.java:1401)
	at org.apache.hadoop.hbase.util.hbck.HFileCorruptionChecker.checkColFamDir(HFileCorruptionChecker.java:175)
	at org.apache.hadoop.hbase.util.hbck.HFileCorruptionChecker.checkRegionDir(HFileCorruptionChecker.java:208)
	at org.apache.hadoop.hbase.util.hbck.HFileCorruptionChecker$RegionDirChecker.call(HFileCorruptionChecker.java:290)
	at org.apache.hadoop.hbase.util.hbck.HFileCorruptionChecker$RegionDirChecker.call(HFileCorruptionChecker.java:281)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:98)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:206)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.GeneratedMethodAccessor27.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:261)
	... 27 more
Caused by: java.io.FileNotFoundException: File does not exist: /user/jon/hbase/testQuarantineMissingHFile/4332ea87d02d33e443550537722ff4fc/fam/befbe65ff30e4a46866f04a5671f0e44
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsUpdateTimes(FSNamesystem.java:1133)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1095)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1067)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:384)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:165)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:42586)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:427)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:916)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1692)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1688)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1686)

	at org.apache.hadoop.ipc.Client.call(Client.java:1161)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:184)
	at $Proxy17.getBlockLocations(Unknown Source)
	at sun.reflect.GeneratedMethodAccessor26.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:165)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:84)
	at $Proxy17.getBlockLocations(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:149)
	... 31 more
{code}
                

[jira] [Updated] (HBASE-6686) HFile Quaranatine fails with missing dirs in hadoop 2.0

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Hsieh updated HBASE-6686:
----------------------------------

    Status: Patch Available  (was: Open)
    

[jira] [Commented] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Hudson (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13448250#comment-13448250 ] 

Hudson commented on HBASE-6686:
-------------------------------

Integrated in HBase-0.94-security #51 (See [https://builds.apache.org/job/HBase-0.94-security/51/])
    HBASE-6686 HFile Quarantine fails with missing dirs in hadoop 2.0 (Revision 1378763)

     Result = FAILURE
jmhsieh : 
Files : 
* /hbase/branches/0.94/src/main/java/org/apache/hadoop/hbase/util/hbck/HFileCorruptionChecker.java

                

[jira] [Commented] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13444498#comment-13444498 ] 

Jonathan Hsieh commented on HBASE-6686:
---------------------------------------

I've filed HBASE-6691 to deal with the remaining missing-file case on trunk.

The current patch fixes the two failures on 92/94 and two of the three failures on 96/trunk.
                

[jira] [Commented] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Jimmy Xiang (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13444494#comment-13444494 ] 

Jimmy Xiang commented on HBASE-6686:
------------------------------------

+1 with the 92/94 patch.
                

[jira] [Commented] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13444478#comment-13444478 ] 

Jonathan Hsieh commented on HBASE-6686:
---------------------------------------

Thanks Colin.  The 0.92/0.94 patch I've attached addresses that portion.  Trunk HBase against hadoop 2 seems to have another problem: I've been digging, and HBase does some funky proxying to override the return value of a method, and some of those functions are only present in trunk. The current thought is that something in these 0.96/trunk changes isn't converting exceptions the same way the normal client does.
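
For context on the "funky proxying" mentioned above, here is a generic sketch of a dynamic proxy whose InvocationHandler unwraps InvocationTargetException. It is not the actual HFileSystem code (the class name UnwrappingHandler is invented for this example); it only illustrates that when a handler skips the unwrapping step, a declared checked exception such as FileNotFoundException reaches the caller as the UndeclaredThrowableException seen in the earlier trace:

{code}
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

// Generic illustration, not the actual HFileSystem proxy code.
public class UnwrappingHandler implements InvocationHandler {
  private final Object target;

  public UnwrappingHandler(Object target) {
    this.target = target;
  }

  @Override
  public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
    try {
      return method.invoke(target, args);
    } catch (InvocationTargetException ite) {
      // Rethrow the real cause (e.g. FileNotFoundException) so callers see the
      // declared exception rather than an UndeclaredThrowableException.
      throw ite.getCause();
    }
  }

  @SuppressWarnings("unchecked")
  public static <T> T wrap(Class<T> iface, T target) {
    return (T) Proxy.newProxyInstance(iface.getClassLoader(),
        new Class<?>[] { iface }, new UnwrappingHandler(target));
  }
}
{code}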
                

[jira] [Commented] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Hudson (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13444731#comment-13444731 ] 

Hudson commented on HBASE-6686:
-------------------------------

Integrated in HBase-0.92-security #124 (See [https://builds.apache.org/job/HBase-0.92-security/124/])
    HBASE-6686 HFile Quarantine fails with missing dirs in hadoop 2.0 (Revision 1378764)

     Result = FAILURE
jmhsieh : 
Files : 
* /hbase/branches/0.92/CHANGES.txt
* /hbase/branches/0.92/src/main/java/org/apache/hadoop/hbase/util/hbck/HFileCorruptionChecker.java

                

[jira] [Commented] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Hudson (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13445637#comment-13445637 ] 

Hudson commented on HBASE-6686:
-------------------------------

Integrated in HBase-0.94 #443 (See [https://builds.apache.org/job/HBase-0.94/443/])
    HBASE-6686 HFile Quarantine fails with missing dirs in hadoop 2.0 (Revision 1378763)

     Result = SUCCESS
jmhsieh : 
Files : 
* /hbase/branches/0.94/src/main/java/org/apache/hadoop/hbase/util/hbck/HFileCorruptionChecker.java

                

[jira] [Updated] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Hsieh updated HBASE-6686:
----------------------------------

    Status: Open  (was: Patch Available)
    

[jira] [Comment Edited] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13444498#comment-13444498 ] 

Jonathan Hsieh edited comment on HBASE-6686 at 8/30/12 9:39 AM:
----------------------------------------------------------------

I've filed HBASE-6691 to deal with the remaining missing file case on trunk.  

The current patch fixes the two failures on 92/94 and two of the three failures on 96/trunk.
                
      was (Author: jmhsieh):
    I've filed HBASE-6691 to deal with the remaining case missing file case on trunk.  

The current patch fixes the two failures on 92/94 and two of the three failures on 96/trunk.
                  

[jira] [Commented] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Colin Patrick McCabe (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13444449#comment-13444449 ] 

Colin Patrick McCabe commented on HBASE-6686:
---------------------------------------------

As far as I can see, hadoop-1 returns null, not an empty list, when the directory does not exist. Admittedly, I only checked DistributedFileSystem.java on line 279 (DistributedFileSystem#listStatus). Maybe there's some other override that does it, but it seems questionable.

You're right that this is an exception in hadoop-2 / cdh4.
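
For illustration only (this is not the committed HBASE-6686 patch): a minimal sketch of a listing helper that normalizes both behaviors discussed above, returning an empty array whether the missing directory shows up as a null/empty result (hadoop-1) or as a FileNotFoundException (hadoop-2 / cdh4). The class and method names (SafeListing, listStatusOrEmpty) are made up for this example.

{code}
import java.io.FileNotFoundException;
import java.io.IOException;

import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/**
 * Hypothetical helper, not the committed patch: lists a directory while
 * tolerating both hadoop-1 and hadoop-2 behavior for a missing path.
 */
public final class SafeListing {
  private SafeListing() {}

  /**
   * Returns the children of dir, or an empty array when the directory is
   * missing, so a caller such as a corruption checker can simply skip it.
   */
  public static FileStatus[] listStatusOrEmpty(FileSystem fs, Path dir)
      throws IOException {
    try {
      FileStatus[] statuses = fs.listStatus(dir);
      // hadoop 1.0 may hand back null for a path that does not exist.
      return statuses == null ? new FileStatus[0] : statuses;
    } catch (FileNotFoundException fnfe) {
      // hadoop 2.0 / cdh4 signals the missing path with an exception instead.
      return new FileStatus[0];
    }
  }
}
{code}

A caller could also guard with fs.exists(dir) before listing, but catching FileNotFoundException keeps the check and the listing in a single call and avoids an extra round trip to the NameNode.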
                

[jira] [Commented] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Hudson (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13444814#comment-13444814 ] 

Hudson commented on HBASE-6686:
-------------------------------

Integrated in HBase-0.92 #545 (See [https://builds.apache.org/job/HBase-0.92/545/])
    HBASE-6686 HFile Quarantine fails with missing dirs in hadoop 2.0 (Revision 1378764)

     Result = FAILURE
jmhsieh : 
Files : 
* /hbase/branches/0.92/CHANGES.txt
* /hbase/branches/0.92/src/main/java/org/apache/hadoop/hbase/util/hbck/HFileCorruptionChecker.java

                

[jira] [Updated] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Hsieh updated HBASE-6686:
----------------------------------

    Attachment:     (was: hbase-6686-trunk.patch)
    

[jira] [Assigned] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Jonathan Hsieh (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Hsieh reassigned HBASE-6686:
-------------------------------------

    Assignee: Jonathan Hsieh
    

[jira] [Commented] (HBASE-6686) HFile Quarantine fails with missing dirs in hadoop 2.0

Posted by "Hudson (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-6686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13444803#comment-13444803 ] 

Hudson commented on HBASE-6686:
-------------------------------

Integrated in HBase-TRUNK-on-Hadoop-2.0.0 #154 (See [https://builds.apache.org/job/HBase-TRUNK-on-Hadoop-2.0.0/154/])
    HBASE-6686 HFile Quarantine fails with missing dirs in hadoop 2.0 (Revision 1378762)

     Result = FAILURE
jmhsieh : 
Files : 
* /hbase/trunk/hbase-server/src/main/java/org/apache/hadoop/hbase/util/hbck/HFileCorruptionChecker.java

                