Posted to common-user@hadoop.apache.org by S D <sd...@gmail.com> on 2009/02/03 23:29:36 UTC

Hadoop FS Shell - command overwrite capability

I'm using the Hadoop FS commands to move files from my local machine into
the Hadoop dfs. I'd like a way to force a write to the dfs even if a file of
the same name exists. Ideally I'd like to use a "-force" switch or some
such; e.g.,
    hadoop dfs -copyFromLocal -force adirectory s3n://wholeinthebucket/

Is there a way to do this or does anyone know if this is in the future
Hadoop plans?

Thanks
John SD

Re: Hadoop FS Shell - command overwrite capability

Posted by S D <sd...@gmail.com>.
Rasit,

Thanks for this comment. I do need console-based control and will consider
your suggestion of using a jar file.

Thanks,
John

On Wed, Feb 4, 2009 at 10:17 AM, Rasit OZDAS <ra...@gmail.com> wrote:

> John, I couldn't find a way to do this from the console either.
> You may already know this and prefer not to use it, but the API solves the
> problem:
> FileSystem.copyFromLocalFile(boolean delSrc, boolean overwrite, Path src,
> Path dst)
>
> If you have to work from the console it is a longer solution, but you can
> create a jar for this and call it just as the "hadoop" script in the bin
> directory calls the FileSystem class.
>
> I think the FileSystem API also needs some improvement here; I wonder
> whether the core developers are considering it.
>
> Hope this helps,
> Rasit
>
> 2009/2/4 S D <sd...@gmail.com>:
> > I'm using the Hadoop FS commands to move files from my local machine into
> > the Hadoop dfs. I'd like a way to force a write to the dfs even if a file of
> > the same name exists. Ideally I'd like to use a "-force" switch or some
> > such; e.g.,
> >    hadoop dfs -copyFromLocal -force adirectory s3n://wholeinthebucket/
> >
> > Is there a way to do this or does anyone know if this is in the future
> > Hadoop plans?
> >
> > Thanks
> > John SD
> >
>
>
>
> --
> M. Raşit ÖZDAŞ
>

Re: Hadoop FS Shell - command overwrite capability

Posted by Rasit OZDAS <ra...@gmail.com>.
John, I couldn't find a way to do this from the console either.
You may already know this and prefer not to use it, but the API solves the
problem:
FileSystem.copyFromLocalFile(boolean delSrc, boolean overwrite, Path src,
Path dst)

If you have to work from the console it is a longer solution, but you can
create a jar for this and call it just as the "hadoop" script in the bin
directory calls the FileSystem class.

I think the FileSystem API also needs some improvement here; I wonder
whether the core developers are considering it.
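A minimal sketch of the jar-based workaround described above, using the
FileSystem API (the class name, argument handling, and paths are illustrative
assumptions, not from this thread; it assumes the Hadoop client jars are on
the classpath):

```java
// ForceCopy.java - sketch of a small jar that forces an overwriting copy
// into the dfs, as the thread suggests. Class name and usage are illustrative.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ForceCopy {
    public static void main(String[] args) throws Exception {
        if (args.length != 2) {
            System.err.println("Usage: ForceCopy <localSrc> <dfsDst>");
            System.exit(1);
        }
        // Picks up core-site.xml etc. from the cluster configuration.
        Configuration conf = new Configuration();
        Path dst = new Path(args[1]);
        // Resolve the FileSystem from the destination URI (hdfs, s3n, ...).
        FileSystem fs = dst.getFileSystem(conf);
        // delSrc=false keeps the local copy; overwrite=true forces the write
        // even when a file of the same name already exists at the destination.
        fs.copyFromLocalFile(false, true, new Path(args[0]), dst);
        fs.close();
    }
}
```

Packaged into a jar, it could then be invoked from the console along the lines
of `hadoop jar forcecopy.jar ForceCopy adirectory s3n://wholeinthebucket/`.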

Hope this helps,
Rasit

2009/2/4 S D <sd...@gmail.com>:
> I'm using the Hadoop FS commands to move files from my local machine into
> the Hadoop dfs. I'd like a way to force a write to the dfs even if a file of
> the same name exists. Ideally I'd like to use a "-force" switch or some
> such; e.g.,
>    hadoop dfs -copyFromLocal -force adirectory s3n://wholeinthebucket/
>
> Is there a way to do this or does anyone know if this is in the future
> Hadoop plans?
>
> Thanks
> John SD
>



-- 
M. Raşit ÖZDAŞ