Posted to common-dev@hadoop.apache.org by "Hairong Kuang (JIRA)" <ji...@apache.org> on 2010/08/02 19:26:21 UTC
[jira] Reopened: (HADOOP-6890) Improve listFiles API introduced by HADOOP-6870
[ https://issues.apache.org/jira/browse/HADOOP-6890?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hairong Kuang reopened HADOOP-6890:
-----------------------------------
> Improve listFiles API introduced by HADOOP-6870
> -----------------------------------------------
>
> Key: HADOOP-6890
> URL: https://issues.apache.org/jira/browse/HADOOP-6890
> Project: Hadoop Common
> Issue Type: Improvement
> Components: fs
> Affects Versions: 0.22.0
> Reporter: Hairong Kuang
> Assignee: Hairong Kuang
> Fix For: 0.22.0
>
> Attachments: improveListFiles.patch, listFilesInFS.patch
>
>
> This jira is mainly for addressing Suresh's review comments for HADOOP-6870:
> 1. General comment: I have concerns about recursive listing. It could be abused by applications, generating a large number of requests to HDFS.
> 2. Deleting files or directories while recursing through a directory tree results in a RuntimeException, leaving the application with a partial result. Should we ignore a directory that was on the stack but is no longer found when we iterate into it?
> 3. FileSystem.java
> * listFiles() - the method javadoc could be better organized: first describe the case where the path is a directory, covering recursive=true and recursive=false, then the case where the path is a file, again covering both values of recursive.
> * listFiles() - document that RuntimeException and UnsupportedOperationException may be thrown, and their possible causes; IOException is no longer thrown.
> 4. TestListFiles.java
> * testDirectory() - the comments "test empty directory" and "test directory with 1 file" should be moved up to the relevant sections of the test.
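The concurrent-deletion concern in comment 2 can be illustrated outside of Hadoop. The sketch below is a plain-Java analogue (not Hadoop's actual implementation, and the class name `TolerantListing` is hypothetical): a recursive listing that skips entries deleted mid-traversal instead of aborting with an exception and returning a partial result.

```java
import java.io.IOException;
import java.nio.file.*;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.ArrayList;
import java.util.List;

public class TolerantListing {
    // Recursively list regular files under root. An entry that vanishes
    // between the directory scan and the visit (e.g. deleted by another
    // process) is skipped instead of aborting the whole traversal --
    // the behavior comment 2 suggests for the HDFS listFiles iterator.
    public static List<Path> listFilesIgnoringDeleted(Path root) throws IOException {
        List<Path> result = new ArrayList<>();
        Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) {
                result.add(file);
                return FileVisitResult.CONTINUE;
            }
            @Override
            public FileVisitResult visitFileFailed(Path file, IOException exc) {
                // NoSuchFileException here means the entry was removed
                // mid-traversal; ignore it and keep going.
                if (exc instanceof NoSuchFileException) {
                    return FileVisitResult.CONTINUE;
                }
                return FileVisitResult.TERMINATE;
            }
        });
        return result;
    }

    public static void main(String[] args) throws IOException {
        // Build a small tree: root/a.txt and root/sub/b.txt.
        Path root = Files.createTempDirectory("listing");
        Files.createFile(root.resolve("a.txt"));
        Path sub = Files.createDirectory(root.resolve("sub"));
        Files.createFile(sub.resolve("b.txt"));
        System.out.println(listFilesIgnoringDeleted(root).size()); // prints 2
    }
}
```

In Hadoop's FileSystem.listFiles(path, recursive), the iterator pulls from the NameNode lazily, so the same race exists for the whole lifetime of the iteration; silently skipping vanished directories, as above, trades strict consistency of the snapshot for robustness of the traversal.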
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.