Posted to common-issues@hadoop.apache.org by "Hongyuan Li (JIRA)" <ji...@apache.org> on 2017/06/10 04:11:21 UTC

[jira] [Updated] (HADOOP-14469) FTPFileSystem#listStatus get currentPath and parentPath, which will cause recursively list stuck

     [ https://issues.apache.org/jira/browse/HADOOP-14469?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hongyuan Li updated HADOOP-14469:
---------------------------------
    Summary: FTPFileSystem#listStatus get currentPath and parentPath, which will cause recursively list stuck  (was: FTPFileSystem#listStatus get currentPath and parentPath, which is wrong)

> FTPFileSystem#listStatus get currentPath and parentPath, which will cause recursively list stuck
> ------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-14469
>                 URL: https://issues.apache.org/jira/browse/HADOOP-14469
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: fs
>    Affects Versions: 3.0.0-alpha3
>         Environment: FTP server built on Windows 7 + Serv-U_6412.1.0.8
> code runs on any OS
>            Reporter: Hongyuan Li
>            Assignee: Hongyuan Li
>         Attachments: HADOOP-14469-001.patch, HADOOP-14469-002.patch, HADOOP-14469-003.patch
>
>
> For some FTP systems (for example, Serv-U), the listStatus method returns new Path(".") and new Path(".."), which causes a recursive list operation to loop forever.
> We can see the logic in the code below:
> {code}
>   private FileStatus[] listStatus(FTPClient client, Path file)
>       throws IOException {
>     // ……
>     FileStatus[] fileStats = new FileStatus[ftpFiles.length];
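>     // Note: on some servers (e.g. Serv-U) ftpFiles still contains the "." and ".."
>     // entries, and they are turned into FileStatus objects below without being filtered.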
>     for (int i = 0; i < ftpFiles.length; i++) {
>       fileStats[i] = getFileStatus(ftpFiles[i], absolute);
>     }
>     return fileStats;
>   }
> {code}
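>
> A minimal sketch of one way to address this (not necessarily what the attached patches do): skip entries named "." and ".." before building the FileStatus array. The helper name filterDotEntries is hypothetical; the sketch only assumes Apache Commons Net's FTPFile#getName, which FTPFileSystem already uses.
> {code}
> import java.util.ArrayList;
> import java.util.List;
> import org.apache.commons.net.ftp.FTPFile;
>
>   /**
>    * Hypothetical helper: drops the "." and ".." entries that some servers
>    * (e.g. Serv-U) return from LIST, so a recursive listing never re-visits
>    * the current or parent directory.
>    */
>   private static FTPFile[] filterDotEntries(FTPFile[] ftpFiles) {
>     List<FTPFile> filtered = new ArrayList<>();
>     for (FTPFile ftpFile : ftpFiles) {
>       String name = ftpFile.getName();
>       if (!".".equals(name) && !"..".equals(name)) {
>         filtered.add(ftpFile);
>       }
>     }
>     return filtered.toArray(new FTPFile[0]);
>   }
> {code}
> listStatus could then build fileStats from filterDotEntries(ftpFiles) instead of the raw array.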


