Posted to mapreduce-user@hadoop.apache.org by Anil Jagtap <an...@gmail.com> on 2014/12/27 23:19:58 UTC

Putting multiple files..

Dear All,

Just wanted to know if there is a way to copy multiple files using hadoop
fs -put.

Instead of specifying each name individually, I would like to provide
wildcards so that the matching files get copied.

Thank You.

Rgds, Anil

Re: Putting multiple files..

Posted by hadoop hive <ha...@gmail.com>.
You can write a small shell script to do that :)

On Sun, Dec 28, 2014 at 3:49 AM, Anil Jagtap <an...@gmail.com> wrote:

> Dear All,
>
> Just wanted to know if there is a way to copy multiple files using hadoop
> fs -put.
>
> Instead of specifying individual name I provide wild-chars and respective
> files should get copied.
>
> Thank You.
>
> Rgds, Anil
>
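Such a script can be sketched in a few lines. This is a minimal, untested-against-a-cluster sketch: the `put_glob` name, the destination path, and the `HDFS_PUT` override are all illustrative, and it assumes the `hdfs` CLI is on `PATH`.

```shell
# put_glob: copy every local file matching a glob into an HDFS directory.
# Sketch only; override HDFS_PUT (e.g. HDFS_PUT=echo) to dry-run without
# a running cluster.
put_glob() {
  dest=$1
  shift
  for f in "$@"; do                  # the shell has already expanded the glob
    if [ -e "$f" ]; then             # skip an unmatched pattern passed literally
      ${HDFS_PUT:-hdfs dfs -put} "$f" "$dest" || return 1
    fi
  done
}

# Usage: put_glob /user/anil/data ./logs/*.csv
```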

Re: Putting multiple files..

Posted by Chris Nauroth <cn...@hortonworks.com>.
Also, bash wildcard expansion should automatically put the full list of
matching files into the list of local source arguments prior to execution.
For example, assuming three files named hello1, hello2, and hello3, running
the following command...

hdfs dfs -put hello* /user/chris

...should turn into this automatically...

hdfs dfs -put hello1 hello2 hello3 /user/chris

If that meets your need, it avoids spelling out each local source file name
individually.
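That expansion can be checked without a cluster by letting echo stand in for the hdfs command; the scratch directory and file names below are just fixtures for the demonstration.

```shell
# The wildcard is expanded by the shell before hdfs ever runs, so the
# command receives the full file list as separate arguments.
dir=$(mktemp -d)
cd "$dir"
touch hello1 hello2 hello3
# echo stands in for hdfs here purely to show the expanded argument list
echo hdfs dfs -put hello* /user/chris
# prints: hdfs dfs -put hello1 hello2 hello3 /user/chris
```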

Chris Nauroth
Hortonworks
http://hortonworks.com/


On Sun, Jan 4, 2015 at 10:09 PM, "Cao Yi.曹铱" <ca...@perfect-cn.cn> wrote:

>  As of the current version (2.6.0), *hadoop dfs* is deprecated; use *hdfs
> dfs* instead.
>
> Ref:
>
> http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/CommandsManual.html
>
> 在 2014年12月29日 05:26, Abhishek Singh 写道:
>
> Hello Anil,
>
>  There are 2 ways I'm aware of :-
>
> 1) use put command
>
> put
>
> Usage: hadoop fs -put <localsrc> ... <dst>
>
> Copy single src, or multiple srcs from local file system to the
> destination filesystem. Also reads input from stdin and writes to
> destination filesystem.
>
>     hadoop fs -put localfile /user/hadoop/hadoopfile
>     hadoop fs -put localfile1 localfile2 /user/hadoop/hadoopdir
>     hadoop fs -put localfile hdfs://nn.example.com/hadoop/hadoopfile
>     hadoop fs -put - hdfs://nn.example.com/hadoop/hadoopfile
>     Reads the input from stdin.
>
> Exit Code:
>
> Returns 0 on success and -1 on error.
>
> 2) Create a shell script for your custom need.
>
> To give you a vague idea here's one of the link on stackoverflow which is
> similar to what you are demanding:-
>
>
> http://stackoverflow.com/questions/12790166/shell-script-to-move-files-into-a-hadoop-cluster
>
>  Please reach out for further discussion!
>
> Thanks!
>
> Regards,
>
> Abhishek Singh
> On Dec 28, 2014 3:52 AM, "Anil Jagtap" <an...@gmail.com> wrote:
>
>> Dear All,
>>
>>  Just wanted to know if there is a way to copy multiple files using
>> hadoop fs -put.
>>
>>  Instead of specifying individual name I provide wild-chars and
>> respective files should get copied.
>>
>>  Thank You.
>>
>>  Rgds, Anil
>>
>
> --
> Best Regards,
> Cao Yi, 曹铱
> Tel: 189-8052-8753
>
> 北京普菲特广告有限公司(成都)
>
>

-- 
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to 
which it is addressed and may contain information that is confidential, 
privileged and exempt from disclosure under applicable law. If the reader 
of this message is not the intended recipient, you are hereby notified that 
any printing, copying, dissemination, distribution, disclosure or 
forwarding of this communication is strictly prohibited. If you have 
received this communication in error, please contact the sender immediately 
and delete it from your system. Thank You.

Re: Putting multiple files..

Posted by "Cao Yi.曹铱" <ca...@perfect-cn.cn>.
As of the current version (2.6.0), /hadoop dfs/ is deprecated; use /hdfs dfs/
instead.

Ref:
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/CommandsManual.html

在 2014年12月29日 05:26, Abhishek Singh 写道:
>
> Hello Anil,
>
> There are 2 ways I'm aware of :-
>
> 1) use put command
>
> put
>
> Usage: hadoop fs -put <localsrc> ... <dst>
>
> Copy single src, or multiple srcs from local file system to the 
> destination filesystem. Also reads input from stdin and writes to 
> destination filesystem.
>
>     hadoop fs -put localfile /user/hadoop/hadoopfile
>     hadoop fs -put localfile1 localfile2 /user/hadoop/hadoopdir
>     hadoop fs -put localfile hdfs://nn.example.com/hadoop/hadoopfile 
> <http://nn.example.com/hadoop/hadoopfile>
>     hadoop fs -put - hdfs://nn.example.com/hadoop/hadoopfile 
> <http://nn.example.com/hadoop/hadoopfile>
>     Reads the input from stdin.
>
> Exit Code:
>
> Returns 0 on success and -1 on error.
>
> 2) Create a shell script for your custom need.
>
> To give you a vague idea here's one of the link on stackoverflow which 
> is similar to what you are demanding:-
>
> http://stackoverflow.com/questions/12790166/shell-script-to-move-files-into-a-hadoop-cluster
>
> Please reach out for further discussion!
>
> Thanks!
>
> Regards,
>
> Abhishek Singh
>
> On Dec 28, 2014 3:52 AM, "Anil Jagtap" <anil.jagtap@gmail.com 
> <ma...@gmail.com>> wrote:
>
>     Dear All,
>
>     Just wanted to know if there is a way to copy multiple files using
>     hadoop fs -put.
>
>     Instead of specifying individual name I provide wild-chars and
>     respective files should get copied.
>
>     Thank You.
>
>     Rgds, Anil
>

-- 
Best Regards,
Cao Yi, 曹铱
Tel: 189-8052-8753

北京普菲特广告有限公司(成都)


Re: Putting multiple files..

Posted by Abhishek Singh <23...@gmail.com>.
Hello Anil,

There are two ways I'm aware of:

1) use put command

put

Usage: hadoop fs -put <localsrc> ... <dst>

Copy single src, or multiple srcs from local file system to the destination
filesystem. Also reads input from stdin and writes to destination
filesystem.

    hadoop fs -put localfile /user/hadoop/hadoopfile
    hadoop fs -put localfile1 localfile2 /user/hadoop/hadoopdir
    hadoop fs -put localfile hdfs://nn.example.com/hadoop/hadoopfile
    hadoop fs -put - hdfs://nn.example.com/hadoop/hadoopfile
    Reads the input from stdin.

Exit Code:

Returns 0 on success and -1 on error.

2) Create a shell script for your custom need.

To give you a rough idea, here is a link on Stack Overflow that is
similar to what you are asking for:

http://stackoverflow.com/questions/12790166/shell-script-to-move-files-into-a-hadoop-cluster
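In the spirit of that Stack Overflow thread, a *move* (rather than copy) into the cluster can be sketched as: upload, check the exit status, then delete the local copy only on success. This is not the thread's actual script; the function name, paths, and the `HDFS_PUT` override are illustrative.

```shell
# move_to_hdfs: upload one file, then delete the local copy only if the
# upload succeeded. Sketch only; override HDFS_PUT (e.g. HDFS_PUT=true)
# to exercise the control flow without a cluster.
move_to_hdfs() {
  src=$1
  dest=$2
  if ${HDFS_PUT:-hdfs dfs -put} "$src" "$dest"; then
    rm -f -- "$src"                  # upload reported success: drop local copy
  else
    echo "upload failed, keeping $src" >&2
    return 1
  fi
}
```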

Please reach out for further discussion!

Thanks!

Regards,

Abhishek Singh
On Dec 28, 2014 3:52 AM, "Anil Jagtap" <an...@gmail.com> wrote:

> Dear All,
>
> Just wanted to know if there is a way to copy multiple files using hadoop
> fs -put.
>
> Instead of specifying individual name I provide wild-chars and respective
> files should get copied.
>
> Thank You.
>
> Rgds, Anil
>
