Posted to common-issues@hadoop.apache.org by "Daryn Sharp (JIRA)" <ji...@apache.org> on 2013/04/12 18:32:17 UTC

[jira] [Commented] (HADOOP-9474) fs -put command doesn't work when selecting certain files from a local folder

    [ https://issues.apache.org/jira/browse/HADOOP-9474?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13630276#comment-13630276 ] 

Daryn Sharp commented on HADOOP-9474:
-------------------------------------

Someone should consider back-porting the post-1.x FsShell.  It fixes virtually all of the issues being reported against 1.x.  I would expect the new FsShell to be practically a drop-in replacement, although its consistent behavior and POSIX compliance will introduce some incompatibilities.
                
> fs -put command doesn't work when selecting certain files from a local folder
> -----------------------------------------------------------------------------
>
>                 Key: HADOOP-9474
>                 URL: https://issues.apache.org/jira/browse/HADOOP-9474
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: fs
>    Affects Versions: 1.1.2
>            Reporter: Glen Mazza
>
> The following four commands (a) - (d) were run sequentially.  From (a) through (c), the HDFS folder "inputABC" does not yet exist.
> Commands (a) and (b) improperly refuse to put the files matching conf/*.xml into inputABC because the folder inputABC doesn't exist yet.  However, in (c), when I make the same request with just "conf" (rather than "conf/*.xml"), HDFS correctly creates inputABC and copies the files over.  We can see that inputABC now exists, because in (d), when I subsequently try to copy the conf/*.xml files again, it complains that those files already exist there.
> In other words, I can put "conf" into a nonexistent HDFS folder and fs will create the folder for me, but I can't do the same with "conf/*.xml", even though the latter should work equally well.  The problem appears to be in org.apache.hadoop.fs.FileUtil, line 176, which properly routes "conf" to have its files copied but makes "conf/*.xml" subsequently return a "nonexistent folder" error (a sketch of this check appears after the transcript below).
> {noformat}
> a) gmazza@gmazza-work:/media/work1/hadoop-1.1.2$ bin/hadoop fs -put conf/*.xml inputABC
> put: `inputABC': specified destination directory doest not exist
> b) gmazza@gmazza-work:/media/work1/hadoop-1.1.2$ bin/hadoop fs -put conf/*.xml inputABC
> put: `inputABC': specified destination directory doest not exist
> c) gmazza@gmazza-work:/media/work1/hadoop-1.1.2$ bin/hadoop fs -put conf inputABC
> d) gmazza@gmazza-work:/media/work1/hadoop-1.1.2$ bin/hadoop fs -put conf/*.xml inputABC
> put: Target inputABC/capacity-scheduler.xml already exists
> Target inputABC/core-site.xml already exists
> Target inputABC/fair-scheduler.xml already exists
> Target inputABC/hadoop-policy.xml already exists
> Target inputABC/hdfs-site.xml already exists
> Target inputABC/mapred-queue-acls.xml already exists
> Target inputABC/mapred-site.xml already exists
> {noformat}
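
To make the reported split concrete, here is a minimal, hypothetical Java sketch; it is not the actual org.apache.hadoop.fs.FileUtil or FsShell source, and the class and method names are illustrative only. It assumes the 1.x copy path treats "many sources" and "one source" differently: when the shell has already expanded conf/*.xml into several sources, the destination must be an existing directory, whereas a single directory source is allowed to create the destination as part of the copy.

{code:java}
import java.io.File;
import java.io.IOException;

// Hypothetical sketch, not the real FileUtil code: it only illustrates why
// "put conf/*.xml inputABC" can fail while "put conf inputABC" succeeds
// when inputABC does not exist yet.
public class PutDestinationCheckSketch {

    static void copyToDest(File[] srcs, File dst) throws IOException {
        if (srcs.length > 1 && !dst.isDirectory()) {
            // Many sources (the shell already expanded conf/*.xml):
            // the destination must be an existing directory.
            throw new IOException("`" + dst + "': destination directory does not exist");
        }
        // A single source such as the directory "conf" skips the check above,
        // so the destination can be created as part of the copy.
        for (File src : srcs) {
            System.out.println("would copy " + src + " -> " + new File(dst, src.getName()));
        }
    }

    public static void main(String[] args) throws IOException {
        File dst = new File("inputABC");                      // does not exist yet
        File[] oneDir = { new File("conf") };                 // like: -put conf inputABC
        File[] manyXml = {                                    // like: -put conf/*.xml inputABC
            new File("conf/core-site.xml"),
            new File("conf/hdfs-site.xml")
        };

        copyToDest(oneDir, dst);    // passes the check: only one source
        copyToDest(manyXml, dst);   // throws: several sources, dst must pre-exist
    }
}
{code}

The reporter's observation that a bare "conf" source works while the expanded glob does not is consistent with a check of this shape; the back-ported FsShell suggested above would be the place to change it, not user scripts.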
