Posted to hdfs-user@hadoop.apache.org by Tao <zt...@outlook.com> on 2012/08/28 17:36:08 UTC

distcp error.

Hi, all

         I am using distcp to copy data from Hadoop 1.0.3 to Hadoop 2.0.1.

         When the file path (or file name) contains Chinese characters, an
exception is thrown, as shown below. I need some help with this.

         Thanks.

[hdfs@host ~]$ hadoop distcp -i -prbugp -m 14 -overwrite -log /tmp/distcp.log hftp://10.xx.xx.aa:50070/tmp/中文路径测试 hdfs://10.xx.xx.bb:54310/tmp/distcp_test14

12/08/28 23:32:31 INFO tools.DistCp: Input Options: DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, ignoreFailures=true, maxMaps=14, sslConfigurationFile='null', copyStrategy='uniformsize', sourceFileListing=null, sourcePaths=[hftp://10.xx.xx.aa:50070/tmp/中文路径测试], targetPath=hdfs://10.xx.xx.bb:54310/tmp/distcp_test14}
12/08/28 23:32:33 INFO tools.DistCp: DistCp job log path: /tmp/distcp.log
12/08/28 23:32:34 WARN conf.Configuration: io.sort.mb is deprecated. Instead, use mapreduce.task.io.sort.mb
12/08/28 23:32:34 WARN conf.Configuration: io.sort.factor is deprecated. Instead, use mapreduce.task.io.sort.factor
12/08/28 23:32:34 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
12/08/28 23:32:36 INFO mapreduce.JobSubmitter: number of splits:1
12/08/28 23:32:36 WARN conf.Configuration: mapred.jar is deprecated. Instead, use mapreduce.job.jar
12/08/28 23:32:36 WARN conf.Configuration: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
12/08/28 23:32:36 WARN conf.Configuration: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.value.class is deprecated. Instead, use mapreduce.map.output.value.class
12/08/28 23:32:36 WARN conf.Configuration: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
12/08/28 23:32:36 WARN conf.Configuration: mapred.job.name is deprecated. Instead, use mapreduce.job.name
12/08/28 23:32:36 WARN conf.Configuration: mapreduce.inputformat.class is deprecated. Instead, use mapreduce.job.inputformat.class
12/08/28 23:32:36 WARN conf.Configuration: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
12/08/28 23:32:36 WARN conf.Configuration: mapreduce.outputformat.class is deprecated. Instead, use mapreduce.job.outputformat.class
12/08/28 23:32:36 WARN conf.Configuration: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.key.class is deprecated. Instead, use mapreduce.map.output.key.class
12/08/28 23:32:36 WARN conf.Configuration: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
12/08/28 23:32:37 INFO mapred.ResourceMgrDelegate: Submitted application application_1345831938927_0039 to ResourceManager at baby20/10.1.1.40:8040
12/08/28 23:32:37 INFO mapreduce.Job: The url to track the job: http://baby20:8088/proxy/application_1345831938927_0039/
12/08/28 23:32:37 INFO tools.DistCp: DistCp job-id: job_1345831938927_0039
12/08/28 23:32:37 INFO mapreduce.Job: Running job: job_1345831938927_0039
12/08/28 23:32:50 INFO mapreduce.Job: Job job_1345831938927_0039 running in uber mode : false
12/08/28 23:32:50 INFO mapreduce.Job:  map 0% reduce 0%
12/08/28 23:33:00 INFO mapreduce.Job:  map 100% reduce 0%
12/08/28 23:33:00 INFO mapreduce.Job: Task Id : attempt_1345831938927_0039_m_000000_0, Status : FAILED
Error: java.io.IOException: File copy failed: hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 --> hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 to hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
        ... 10 more
Caused by: org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException: java.io.IOException: HTTP_OK expected, received 500
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:201)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:167)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:112)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:90)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:71)
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
        ... 11 more
Caused by: java.io.IOException: HTTP_OK expected, received 500
        at org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCode(HftpFileSystem.java:381)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:121)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:158)
        at java.io.DataInputStream.read(DataInputStream.java:132)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
        at java.io.FilterInputStream.read(FilterInputStream.java:90)
        at org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.java:70)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:198)
        ... 16 more

Re: distcp error.

Posted by 심병렬 <si...@gmail.com>.
Unsubscribe


Re: distcp error.

Posted by Dan Young <da...@gmail.com>.
I was just reading about this in the Hadoop Definitive Guide last night.
The two clusters need to be the same version for an hdfs-to-hdfs copy; you can try hftp between versions, as in the sketch below.

Regards

Dano
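
A minimal sketch of that pattern, reusing the hosts and paths from the original post (assumptions, not a retested command): hftp is read-only, so it can only appear on the source side, and the job runs on the newer 2.0.1 destination cluster so the hdfs:// writes use a matching client:

hadoop distcp -i -m 14 -overwrite -log /tmp/distcp.log \
  hftp://10.xx.xx.aa:50070/tmp/中文路径测试 \
  hdfs://10.xx.xx.bb:54310/tmp/distcp_test14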


Re: distcp error.

Posted by Daryn Sharp <da...@yahoo-inc.com>.
Try taking a look at your NN logs to see why it had an internal server error.  I believe the servlets are hardcoded to decode the path as UTF-8, but maybe the client used a different encoding.

Daryn
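
One client-side way to test that theory (a sketch, not a confirmed fix): percent-encode the non-ASCII path components as UTF-8 bytes yourself, so that what reaches the servlet no longer depends on the submitting JVM's default encoding. With the placeholders from the original command, 中文路径测试 encoded as UTF-8 is %E4%B8%AD%E6%96%87%E8%B7%AF%E5%BE%84%E6%B5%8B%E8%AF%95:

hadoop distcp -i -prbugp -m 14 -overwrite -log /tmp/distcp.log \
  'hftp://10.xx.xx.aa:50070/tmp/%E4%B8%AD%E6%96%87%E8%B7%AF%E5%BE%84%E6%B5%8B%E8%AF%95' \
  hdfs://10.xx.xx.bb:54310/tmp/distcp_test14

If the pre-encoded form copies while the raw form still fails with 500/400, a non-UTF-8 locale (or -Dfile.encoding) on the submitting host is the likely culprit, and the NameNode log line for the failing request should show which bytes actually arrived.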

On Aug 28, 2012, at 11:30 AM, Tao wrote:

Hi,
         Thanks for your reply.
         I have tried both 1.0.3-to-1.0.3 and 2.0.1-to-2.0.1, and both failed.

         The path contains Chinese characters.
         1.0.3 hftp to 1.0.3 hdfs: the exception is below.
12/08/29 00:24:23 INFO tools.DistCp: sourcePathsCount=2
12/08/29 00:24:23 INFO tools.DistCp: filesToCopyCount=1
12/08/29 00:24:23 INFO tools.DistCp: bytesToCopyCount=1.2k
12/08/29 00:24:24 INFO mapred.JobClient: Running job: job_201208101345_2203
12/08/29 00:24:25 INFO mapred.JobClient:  map 0% reduce 0%
12/08/29 00:24:46 INFO mapred.JobClient: Task Id : attempt_201208101345_2203_m_000000_0, Status : FAILED
java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
        at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

12/08/29 00:25:04 INFO mapred.JobClient: Task Id : attempt_201208101345_2203_m_000000_1, Status : FAILED
java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
        at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

12/08/29 00:25:19 INFO mapred.JobClient: Task Id : attempt_201208101345_2203_m_000000_2, Status : FAILED
java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
        at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

12/08/29 00:25:40 INFO mapred.JobClient: Job complete: job_201208101345_2203
12/08/29 00:25:40 INFO mapred.JobClient: Counters: 6
12/08/29 00:25:40 INFO mapred.JobClient:   Job Counters
12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=66844
12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
12/08/29 00:25:40 INFO mapred.JobClient:     Launched map tasks=4
12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
12/08/29 00:25:40 INFO mapred.JobClient:     Failed map tasks=1
12/08/29 00:25:40 INFO mapred.JobClient: Job Failed: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201208101345_2203_m_000000
With failures, global counters are inaccurate; consider running with -i
Copy failed: java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1265)
        at org.apache.hadoop.tools.DistCp.copy(DistCp.java:667)
        at org.apache.hadoop.tools.DistCp.run(DistCp.java:881)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.tools.DistCp.main(DistCp.java:908)


         2.0.1 hftp to 2.0.1 hdfs: the exception is below.
12/08/29 00:20:06 INFO tools.DistCp: DistCp job-id: job_1345831938927_0043
12/08/29 00:20:06 INFO mapreduce.Job: Running job: job_1345831938927_0043
12/08/29 00:20:14 INFO mapreduce.Job: Job job_1345831938927_0043 running in uber mode : false
12/08/29 00:20:14 INFO mapreduce.Job:  map 0% reduce 0%
12/08/29 00:20:23 INFO mapreduce.Job: Task Id : attempt_1345831938927_0043_m_000000_0, Status : FAILED
Error: java.io.IOException: File copy failed: hftp://baby20:50070/tmp/中文.log/add.csv --> hdfs://baby20:54310/tmp4/add.csv
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hftp://baby20:50070/tmp/中文.log/add.csv to hdfs://baby20:54310/tmp4/add.csv
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
        ... 10 more
Caused by: org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException: java.io.IOException: HTTP_OK expected, received 400
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:201)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:167)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:112)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:90)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:71)
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
        ... 11 more
Caused by: java.io.IOException: HTTP_OK expected, received 400
        at org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCode(HftpFileSystem.java:381)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:121)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:158)
        at java.io.DataInputStream.read(DataInputStream.java:132)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
        at java.io.FilterInputStream.read(FilterInputStream.java:90)
        at org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.java:70)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:198)
        ... 16 more

12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server returned HTTP response code: 400 for URL: http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_0043_m_000000_0&filter=stdout
12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server returned HTTP response code: 400 for URL: http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_0043_m_000000_0&filter=stderr



From: Marcos Ortiz [mailto:mlortiz@uci.cu]
Sent: August 28, 2012 23:53
To: user@hadoop.apache.org
Cc: Tao
Subject: Re: distcp error.

Hi, Tao. Does this problem occur only with 2.0.1, or with both versions?
Have you tried to use distcp from 1.0.3 to 1.0.3?



Re: distcp error.

Posted by Daryn Sharp <da...@yahoo-inc.com>.
Try taking a look at your NN logs to see why it had an internal server error.  I believe the servlets are hardcoded to decode the path as UTF-8, but maybe the client used a different encoding.

Daryn

On Aug 28, 2012, at 11:30 AM, Tao wrote:

Hi,
         Thanks for your reply.
         I have tried between 1.0.3s and 2.0.1s.
         Both are failed.

         Path contain Chinese character.
         1.0.3 hftp to 1.0.3 hdfs, exception inform is below.
                                     12/08/29 00:24:23 INFO tools.DistCp: sourcePathsCount=2
12/08/29 00:24:23 INFO tools.DistCp: filesToCopyCount=1
12/08/29 00:24:23 INFO tools.DistCp: bytesToCopyCount=1.2k
12/08/29 00:24:24 INFO mapred.JobClient: Running job: job_201208101345_2203
12/08/29 00:24:25 INFO mapred.JobClient:  map 0% reduce 0%
12/08/29 00:24:46 INFO mapred.JobClient: Task Id : attempt_201208101345_2203_m_000000_0, Status : FAILED
java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
        at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

12/08/29 00:25:04 INFO mapred.JobClient: Task Id : attempt_201208101345_2203_m_000000_1, Status : FAILED
java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
        at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

12/08/29 00:25:19 INFO mapred.JobClient: Task Id : attempt_201208101345_2203_m_000000_2, Status : FAILED
java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
        at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

12/08/29 00:25:40 INFO mapred.JobClient: Job complete: job_201208101345_2203
12/08/29 00:25:40 INFO mapred.JobClient: Counters: 6
12/08/29 00:25:40 INFO mapred.JobClient:   Job Counters
12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=66844
12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
12/08/29 00:25:40 INFO mapred.JobClient:     Launched map tasks=4
12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
12/08/29 00:25:40 INFO mapred.JobClient:     Failed map tasks=1
12/08/29 00:25:40 INFO mapred.JobClient: Job Failed: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201208101345_2203_m_000000
With failures, global counters are inaccurate; consider running with -i
Copy failed: java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1265)
        at org.apache.hadoop.tools.DistCp.copy(DistCp.java:667)
        at org.apache.hadoop.tools.DistCp.run(DistCp.java:881)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.tools.DistCp.main(DistCp.java:908)


         2.0.1 hftp to 2.0.1 hdfs; the exception information is below.
12/08/29 00:20:06 INFO tools.DistCp: DistCp job-id: job_1345831938927_0043
12/08/29 00:20:06 INFO mapreduce.Job: Running job: job_1345831938927_0043
12/08/29 00:20:14 INFO mapreduce.Job: Job job_1345831938927_0043 running in uber mode : false
12/08/29 00:20:14 INFO mapreduce.Job:  map 0% reduce 0%
12/08/29 00:20:23 INFO mapreduce.Job: Task Id : attempt_1345831938927_0043_m_000000_0, Status : FAILED
Error: java.io.IOException: File copy failed: hftp://baby20:50070/tmp/??.log/add.csv --> hdfs://baby20:54310/tmp4/add.csv
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hftp://baby20:50070/tmp/中文.log/add.csv to hdfs://baby20:54310/tmp4/add.csv
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
        ... 10 more
Caused by: org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException: java.io.IOException: HTTP_OK expected, received 400
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:201)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:167)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:112)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:90)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:71)
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
        ... 11 more
Caused by: java.io.IOException: HTTP_OK expected, received 400
        at org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCode(HftpFileSystem.java:381)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:121)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:158)
        at java.io.DataInputStream.read(DataInputStream.java:132)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
        at java.io.FilterInputStream.read(FilterInputStream.java:90)
        at org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.java:70)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:198)
        ... 16 more

12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server returned HTTP response code: 400 for URL: http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_0043_m_000000_0&filter=stdout
12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server returned HTTP response code: 400 for URL: http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_0043_m_000000_0&filter=stderr



From: Marcos Ortiz [mailto:mlortiz@uci.cu]
Sent: August 28, 2012 23:53
To: user@hadoop.apache.org<ma...@hadoop.apache.org>
Cc: Tao
Subject: Re: distcp error.

Hi, Tao. Is this problem only with 2.0.1, or with both versions?
Have you tried using distcp from 1.0.3 to 1.0.3?
On 28/08/2012 11:36, Tao wrote:
Hi, all
         I am using distcp to copy data from Hadoop 1.0.3 to Hadoop 2.0.1.
         When the file path (or file name) contains Chinese characters, an exception is thrown, as shown below. I need some help with this.
         Thanks.




[hdfs@host ~]$ hadoop distcp -i -prbugp -m 14 -overwrite -log /tmp/distcp.log hftp://10.xx.xx.aa:50070/tmp/中文路径测试 hdfs://10.xx.xx.bb:54310/tmp/distcp_test14
12/08/28 23:32:31 INFO tools.DistCp: Input Options: DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, ignoreFailures=true, maxMaps=14, sslConfigurationFile='null', copyStrategy='uniformsize', sourceFileListing=null, sourcePaths=[hftp://10.xx.xx.aa:50070/tmp/中文路径测试], targetPath=hdfs://10.xx.xx.bb:54310/tmp/distcp_test14}
12/08/28 23:32:33 INFO tools.DistCp: DistCp job log path: /tmp/distcp.log
12/08/28 23:32:34 WARN conf.Configuration: io.sort.mb is deprecated. Instead, use mapreduce.task.io.sort.mb
12/08/28 23:32:34 WARN conf.Configuration: io.sort.factor is deprecated. Instead, use mapreduce.task.io.sort.factor
12/08/28 23:32:34 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
12/08/28 23:32:36 INFO mapreduce.JobSubmitter: number of splits:1
12/08/28 23:32:36 WARN conf.Configuration: mapred.jar is deprecated. Instead, use mapreduce.job.jar
12/08/28 23:32:36 WARN conf.Configuration: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
12/08/28 23:32:36 WARN conf.Configuration: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.value.class is deprecated. Instead, use mapreduce.map.output.value.class
12/08/28 23:32:36 WARN conf.Configuration: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
12/08/28 23:32:36 WARN conf.Configuration: mapred.job.name is deprecated. Instead, use mapreduce.job.name
12/08/28 23:32:36 WARN conf.Configuration: mapreduce.inputformat.class is deprecated. Instead, use mapreduce.job.inputformat.class
12/08/28 23:32:36 WARN conf.Configuration: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
12/08/28 23:32:36 WARN conf.Configuration: mapreduce.outputformat.class is deprecated. Instead, use mapreduce.job.outputformat.class
12/08/28 23:32:36 WARN conf.Configuration: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.key.class is deprecated. Instead, use mapreduce.map.output.key.class
12/08/28 23:32:36 WARN conf.Configuration: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
12/08/28 23:32:37 INFO mapred.ResourceMgrDelegate: Submitted application application_1345831938927_0039 to ResourceManager at baby20/10.1.1.40:8040
12/08/28 23:32:37 INFO mapreduce.Job: The url to track the job: http://baby20:8088/proxy/application_1345831938927_0039/
12/08/28 23:32:37 INFO tools.DistCp: DistCp job-id: job_1345831938927_0039
12/08/28 23:32:37 INFO mapreduce.Job: Running job: job_1345831938927_0039
12/08/28 23:32:50 INFO mapreduce.Job: Job job_1345831938927_0039 running in uber mode : false
12/08/28 23:32:50 INFO mapreduce.Job:  map 0% reduce 0%
12/08/28 23:33:00 INFO mapreduce.Job:  map 100% reduce 0%
12/08/28 23:33:00 INFO mapreduce.Job: Task Id : attempt_1345831938927_0039_m_000000_0, Status : FAILED
Error: java.io.IOException: File copy failed: hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 --> hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 to hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
        ... 10 more
Caused by: org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException: java.io.IOException: HTTP_OK expected, received 500
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:201)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:167)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:112)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:90)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:71)
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
        ... 11 more
Caused by: java.io.IOException: HTTP_OK expected, received 500
        at org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCode(HftpFileSystem.java:381)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:121)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:158)
        at java.io.DataInputStream.read(DataInputStream.java:132)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
        at java.io.FilterInputStream.read(FilterInputStream.java:90)
        at org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.java:70)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:198)
        ... 16 more







Re: Re: distcp error.

Posted by Marcos Ortiz <ml...@uci.cu>.
Have you searched the Hadoop JIRA to see whether there is a similar problem to this one?
This could be a bug.
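
Before filing a JIRA, it may help to take distcp out of the picture entirely. Below is a minimal sketch, assuming the source cluster is still reachable at the hftp URI from the trace (baby20 and /tmp/中文.log/add.csv are copied from the log; substitute your own host and path). It opens the file through the ordinary FileSystem API; if this plain read also fails with "HTTP_OK expected, received 400", the bug is in the hftp/servlet encoding path rather than in distcp itself:

import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

// Reads one file over hftp, outside of distcp, to isolate the failure.
public class HftpReadCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Source NameNode from the failing job; replace as needed.
        FileSystem fs = FileSystem.get(URI.create("hftp://baby20:50070"), conf);
        // The file distcp could not copy.
        Path p = new Path("/tmp/中文.log/add.csv");
        InputStream in = fs.open(p);
        // Dump the contents to stdout and close the stream when done.
        IOUtils.copyBytes(in, System.out, 4096, true);
    }
}

Running it once with the 1.0.3 client and once with the 2.0.1 client against the same file would also confirm whether both releases are affected.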

On 28/08/2012 12:30, Tao wrote:
>
> Hi,
>
> Thanks for your reply.
>
> I have tried it both between 1.0.3 clusters and between 2.0.1 clusters.
>
> Both failed.
>
> The path contains Chinese characters.
>
> *1.0.3 hftp to 1.0.3 hdfs; the exception information is below.*
>
> 12/08/29 00:24:23 INFO tools.DistCp: sourcePathsCount=2
>
> 12/08/29 00:24:23 INFO tools.DistCp: filesToCopyCount=1
>
> 12/08/29 00:24:23 INFO tools.DistCp: bytesToCopyCount=1.2k
>
> 12/08/29 00:24:24 INFO mapred.JobClient: Running job:
> job_201208101345_2203
>
> 12/08/29 00:24:25 INFO mapred.JobClient: map 0% reduce 0%
>
> 12/08/29 00:24:46 INFO mapred.JobClient: Task Id :
> attempt_201208101345_2203_m_000000_0, Status : FAILED
>
> java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
>
> at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
>
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
>
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
>
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>
> at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>
> at java.security.AccessController.doPrivileged(Native Method)
>
> at javax.security.auth.Subject.doAs(Subject.java:396)
>
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>
> at org.apache.hadoop.mapred.Child.main(Child.java:249)
>
> 12/08/29 00:25:04 INFO mapred.JobClient: Task Id :
> attempt_201208101345_2203_m_000000_1, Status : FAILED
>
> java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
>
> at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
>
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
>
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
>
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>
> at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>
> at java.security.AccessController.doPrivileged(Native Method)
>
> at javax.security.auth.Subject.doAs(Subject.java:396)
>
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>
> at org.apache.hadoop.mapred.Child.main(Child.java:249)
>
> 12/08/29 00:25:19 INFO mapred.JobClient: Task Id :
> attempt_201208101345_2203_m_000000_2, Status : FAILED
>
> java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
>
> at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
>
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
>
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
>
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>
> at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>
> at java.security.AccessController.doPrivileged(Native Method)
>
> at javax.security.auth.Subject.doAs(Subject.java:396)
>
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>
> at org.apache.hadoop.mapred.Child.main(Child.java:249)
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Job complete:
> job_201208101345_2203
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Counters: 6
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Job Counters
>
> 12/08/29 00:25:40 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=66844
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Total time spent by all
> reduces waiting after reserving slots (ms)=0
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Total time spent by all maps
> waiting after reserving slots (ms)=0
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Launched map tasks=4
>
> 12/08/29 00:25:40 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Failed map tasks=1
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Job Failed: # of failed Map
> Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask:
> task_201208101345_2203_m_000000
>
> With failures, global counters are inaccurate; consider running with -i
>
> Copy failed: java.io.IOException: Job failed!
>
> at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1265)
>
> at org.apache.hadoop.tools.DistCp.copy(DistCp.java:667)
>
> at org.apache.hadoop.tools.DistCp.run(DistCp.java:881)
>
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>
> at org.apache.hadoop.tools.DistCp.main(DistCp.java:908)
>
> *2.0.1 hftp to 2.0.1 hdfs; the exception information is below.*
>
> 12/08/29 00:20:06 INFO tools.DistCp: DistCp job-id: job_1345831938927_0043
>
> 12/08/29 00:20:06 INFO mapreduce.Job: Running job: job_1345831938927_0043
>
> 12/08/29 00:20:14 INFO mapreduce.Job: Job job_1345831938927_0043
> running in uber mode : false
>
> 12/08/29 00:20:14 INFO mapreduce.Job: map 0% reduce 0%
>
> 12/08/29 00:20:23 INFO mapreduce.Job: Task Id :
> attempt_1345831938927_0043_m_000000_0, Status : FAILED
>
> Error: java.io.IOException: File copy failed:
> hftp://baby20:50070/tmp/??.log/add.csv -->
> hdfs://baby20:54310/tmp4/add.csv
>
> at
> org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
>
> at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
>
> at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
>
> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>
> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
>
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
>
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
>
> at java.security.AccessController.doPrivileged(Native Method)
>
> at javax.security.auth.Subject.doAs(Subject.java:396)
>
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>
> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
>
> Caused by: java.io.IOException: Couldn't run retriable-command:
> Copying hftp://baby20:50070/tmp/中文.log/add.csv to
> hdfs://baby20:54310/tmp4/add.csv
>
> at
> org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
>
> at
> org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
>
> ... 10 more
>
> Caused by:
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException:
> java.io.IOException: HTTP_OK expected, received 400
>
> at
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:201)
>
> at
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:167)
>
> at
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:112)
>
> at
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:90)
>
> at
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:71)
>
> at
> org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
>
> ... 11 more
>
> Caused by: java.io.IOException: HTTP_OK expected, received 400
>
> at
> org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCode(HftpFileSystem.java:381)
>
> at
> org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:121)
>
> at
> org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
>
> at
> org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:158)
>
> at java.io.DataInputStream.read(DataInputStream.java:132)
>
> at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
>
> at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
>
> at java.io.FilterInputStream.read(FilterInputStream.java:90)
>
> at
> org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.java:70)
>
> at
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:198)
>
> ... 16 more
>
> 12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server
> returned HTTP response code: 400 for URL:
> http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_0043_m_000000_0&filter=stdout
>
> 12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server
> returned HTTP response code: 400 for URL:
> http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_0043_m_000000_0&filter=stderr
>
> *From:* Marcos Ortiz [mailto:mlortiz@uci.cu]
> *Sent:* August 28, 2012 23:53
> *To:* user@hadoop.apache.org
> *Cc:* Tao
> *Subject:* Re: distcp error.
>
> Hi, Tao. Is this problem only with 2.0.1, or with both versions?
> Have you tried using distcp from 1.0.3 to 1.0.3?
>
> On 28/08/2012 11:36, Tao wrote:
>
>     Hi, all
>
>     I am using distcp to copy data from Hadoop 1.0.3 to Hadoop 2.0.1.
>
>     When the file path (or file name) contains Chinese characters, an
>     exception is thrown, as shown below. I need some help with this.
>
>     Thanks.
>
>     [hdfs@host ~]$ hadoop distcp -i -prbugp -m 14 -overwrite -log
>     /tmp/distcp.log hftp://10.xx.xx.aa:50070/tmp/中文路径测试
>     hdfs://10.xx.xx.bb:54310/tmp/distcp_test14
>
>     12/08/28 23:32:31 INFO tools.DistCp: Input Options:
>     DistCpOptions{atomicCommit=false, syncFolder=false,
>     deleteMissing=false, ignoreFailures=true, maxMaps=14,
>     sslConfigurationFile='null', copyStrategy='uniformsize',
>     sourceFileListing=null, sourcePaths=[hftp://10.xx.xx.aa:50070/tmp
>     /中文路径测试], targetPath=hdfs://10.xx.xx.bb:54310/tmp/distcp_test14}
>
>     12/08/28 23:32:33 INFO tools.DistCp: DistCp job log path:
>     /tmp/distcp.log
>
>     12/08/28 23:32:34 WARN conf.Configuration: io.sort.mb is
>     deprecated. Instead, use mapreduce.task.io.sort.mb
>
>     12/08/28 23:32:34 WARN conf.Configuration: io.sort.factor is
>     deprecated. Instead, use mapreduce.task.io.sort.factor
>
>     12/08/28 23:32:34 WARN util.NativeCodeLoader: Unable to load
>     native-hadoop library for your platform... using builtin-java
>     classes where applicable
>
>     12/08/28 23:32:36 INFO mapreduce.JobSubmitter: number of splits:1
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapred.jar is
>     deprecated. Instead, use mapreduce.job.jar
>
>     12/08/28 23:32:36 WARN conf.Configuration:
>     mapred.map.tasks.speculative.execution is deprecated. Instead, use
>     mapreduce.map.speculative
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapred.reduce.tasks is
>     deprecated. Instead, use mapreduce.job.reduces
>
>     12/08/28 23:32:36 WARN conf.Configuration:
>     mapred.mapoutput.value.class is deprecated. Instead, use
>     mapreduce.map.output.value.class
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapreduce.map.class is
>     deprecated. Instead, use mapreduce.job.map.class
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapred.job.name is
>     deprecated. Instead, use mapreduce.job.name
>
>     12/08/28 23:32:36 WARN conf.Configuration:
>     mapreduce.inputformat.class is deprecated. Instead, use
>     mapreduce.job.inputformat.class
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapred.output.dir is
>     deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
>
>     12/08/28 23:32:36 WARN conf.Configuration:
>     mapreduce.outputformat.class is deprecated. Instead, use
>     mapreduce.job.outputformat.class
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapred.map.tasks is
>     deprecated. Instead, use mapreduce.job.maps
>
>     12/08/28 23:32:36 WARN conf.Configuration:
>     mapred.mapoutput.key.class is deprecated. Instead, use
>     mapreduce.map.output.key.class
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapred.working.dir is
>     deprecated. Instead, use mapreduce.job.working.dir
>
>     12/08/28 23:32:37 INFO mapred.ResourceMgrDelegate: Submitted
>     application application_1345831938927_0039 to ResourceManager at
>     baby20/10.1.1.40:8040
>
>     12/08/28 23:32:37 INFO mapreduce.Job: The url to track the job:
>     http://baby20:8088/proxy/application_1345831938927_0039/
>
>     12/08/28 23:32:37 INFO tools.DistCp: DistCp job-id:
>     job_1345831938927_0039
>
>     12/08/28 23:32:37 INFO mapreduce.Job: Running job:
>     job_1345831938927_0039
>
>     12/08/28 23:32:50 INFO mapreduce.Job: Job job_1345831938927_0039
>     running in uber mode : false
>
>     12/08/28 23:32:50 INFO mapreduce.Job: map 0% reduce 0%
>
>     12/08/28 23:33:00 INFO mapreduce.Job: map 100% reduce 0%
>
>     12/08/28 23:33:00 INFO mapreduce.Job: Task Id :
>     attempt_1345831938927_0039_m_000000_0, Status : FAILED
>
>     Error: java.io.IOException: File copy failed:
>     hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 -->
>     hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
>
>     at
>     org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
>
>     at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
>
>     at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
>
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
>
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
>
>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
>
>     at java.security.AccessController.doPrivileged(Native Method)
>
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>
>     at
>     org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>
>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
>
>     Caused by: java.io.IOException: Couldn't run retriable-command:
>     Copying hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 to
>     hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
>
>     at
>     org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
>
>     at
>     org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
>
>     ... 10 more
>
>     Caused by:
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException:
>     java.io.IOException: HTTP_OK expected, received 500
>
>     at
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:201)
>
>     at
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:167)
>
>     at
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:112)
>
>     at
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:90)
>
>     at
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:71)
>
>     at
>     org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
>
>     ... 11 more
>
>     Caused by: java.io.IOException: HTTP_OK expected, received 500
>
>     at
>     org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCode(HftpFileSystem.java:381)
>
>     at
>     org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:121)
>
>     at
>     org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
>
>     at
>     org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:158)
>
>     at java.io.DataInputStream.read(DataInputStream.java:132)
>
>     at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
>
>     at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
>
>     at java.io.FilterInputStream.read(FilterInputStream.java:90)
>
>     at
>     org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.java:70)
>
>     at
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:198)
>
>     ... 16 more
>





Re: distcp error.

Posted by Daryn Sharp <da...@yahoo-inc.com>.
Try taking a look at your NN logs to see why it had an internal server error.  I believe the servlets are hardcoded to decode the path as UTF-8, but maybe the client used a different encoding.

Daryn

On Aug 28, 2012, at 11:30 AM, Tao wrote:

Hi,
         Thanks for your reply.
         I have tried between 1.0.3s and 2.0.1s.
         Both are failed.

         Path contain Chinese character.
         1.0.3 hftp to 1.0.3 hdfs, exception inform is below.
                                     12/08/29 00:24:23 INFO tools.DistCp: sourcePathsCount=2
12/08/29 00:24:23 INFO tools.DistCp: filesToCopyCount=1
12/08/29 00:24:23 INFO tools.DistCp: bytesToCopyCount=1.2k
12/08/29 00:24:24 INFO mapred.JobClient: Running job: job_201208101345_2203
12/08/29 00:24:25 INFO mapred.JobClient:  map 0% reduce 0%
12/08/29 00:24:46 INFO mapred.JobClient: Task Id : attempt_201208101345_2203_m_000000_0, Status : FAILED
java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
        at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

12/08/29 00:25:04 INFO mapred.JobClient: Task Id : attempt_201208101345_2203_m_000000_1, Status : FAILED
java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
        at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

12/08/29 00:25:19 INFO mapred.JobClient: Task Id : attempt_201208101345_2203_m_000000_2, Status : FAILED
java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
        at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

12/08/29 00:25:40 INFO mapred.JobClient: Job complete: job_201208101345_2203
12/08/29 00:25:40 INFO mapred.JobClient: Counters: 6
12/08/29 00:25:40 INFO mapred.JobClient:   Job Counters
12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=66844
12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
12/08/29 00:25:40 INFO mapred.JobClient:     Launched map tasks=4
12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
12/08/29 00:25:40 INFO mapred.JobClient:     Failed map tasks=1
12/08/29 00:25:40 INFO mapred.JobClient: Job Failed: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201208101345_2203_m_000000
With failures, global counters are inaccurate; consider running with -i
Copy failed: java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1265)
        at org.apache.hadoop.tools.DistCp.copy(DistCp.java:667)
        at org.apache.hadoop.tools.DistCp.run(DistCp.java:881)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.tools.DistCp.main(DistCp.java:908)


         2.0.1 hftp to 2.0.1 hdfs, exception inform is below.
12/08/29 00:20:06 INFO tools.DistCp: DistCp job-id: job_1345831938927_0043
12/08/29 00:20:06 INFO mapreduce.Job: Running job: job_1345831938927_0043
12/08/29 00:20:14 INFO mapreduce.Job: Job job_1345831938927_0043 running in uber mode : false
12/08/29 00:20:14 INFO mapreduce.Job:  map 0% reduce 0%
12/08/29 00:20:23 INFO mapreduce.Job: Task Id : attempt_1345831938927_0043_m_000000_0, Status : FAILED
Error: java.io.IOException: File copy failed: hftp://baby20:50070/tmp/??.log/add.csv --> hdfs://baby20:54310/tmp4/add.csv
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hftp://baby20:50070/tmp/<hftp://baby20:50070/tmp/中文.log/add.csv>中文<hftp://baby20:50070/tmp/中文.log/add.csv>.log/add.csv<hftp://baby20:50070/tmp/中文.log/add.csv> tohdfs://baby20:54310/tmp4/add.csv
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
        ... 10 more
Caused by: org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException: java.io.IOException: HTTP_OK expected, received 400
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:201)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:167)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:112)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:90)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:71)
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
        ... 11 more
Caused by: java.io.IOException: HTTP_OK expected, received 400
        at org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCode(HftpFileSystem.java:381)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:121)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:158)
        at java.io.DataInputStream.read(DataInputStream.java:132)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
        at java.io.FilterInputStream.read(FilterInputStream.java:90)
        at org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.java:70)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:198)
        ... 16 more

12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server returned HTTP response code: 400 for URL:http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_0043_m_000000_0&filter=stdout
12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server returned HTTP response code: 400 for URL:http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_0043_m_000000_0&filter=stderr



发件人: Marcos Ortiz [mailto:mlortiz@uci.cu]
发送时间: 2012年8月28日 23:53
收件人: user@hadoop.apache.org<ma...@hadoop.apache.org>
抄送: Tao
主题: Re: distcp error.

Hi, Tao. This problem is only with 2.0.1 or with the two versions?
Have you tried to use distcp from 1.0.3 to 1.0.3?
El 28/08/2012 11:36, Tao escribió:
Hi, all
         I use distcp copying data from hadoop1.0.3 to hadoop 2.0.1.
         When the file path(or file name) contain Chinese character, an exception will throw. Like below. I need some help about this.
         Thanks.




[hdfs@host ~]$ hadoop distcp -i -prbugp -m 14 -overwrite -log /tmp/distcp.log hftp://10.xx.xx.aa:50070/tmp/<hftp://10.xx.xx.aa:50070/tmp/中文路径测试>中文路径测试<hftp://10.xx.xx.aa:50070/tmp/中文路径测试>hdfs://10.xx.xx.bb:54310/tmp/distcp_test14
12/08/28 23:32:31 INFO tools.DistCp: Input Options: DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, ignoreFailures=true, maxMaps=14, sslConfigurationFile='null', copyStrategy='uniformsize', sourceFileListing=null, sourcePaths=[hftp://10.xx.xx.aa:50070/tmp/<hftp://10.xx.xx.aa:50070/tmp/中文路径测试]>中文路径测试<hftp://10.xx.xx.aa:50070/tmp/中文路径测试]>]<hftp://10.xx.xx.aa:50070/tmp/中文路径测试]>, targetPath=hdfs://10.xx.xx.bb:54310/tmp/distcp_test14}
12/08/28 23:32:33 INFO tools.DistCp: DistCp job log path: /tmp/distcp.log
12/08/28 23:32:34 WARN conf.Configuration: io.sort.mb is deprecated. Instead, use mapreduce.task.io.sort.mb
12/08/28 23:32:34 WARN conf.Configuration: io.sort.factor is deprecated. Instead, use mapreduce.task.io.sort.factor
12/08/28 23:32:34 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
12/08/28 23:32:36 INFO mapreduce.JobSubmitter: number of splits:1
12/08/28 23:32:36 WARN conf.Configuration: mapred.jar is deprecated. Instead, use mapreduce.job.jar
12/08/28 23:32:36 WARN conf.Configuration: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
12/08/28 23:32:36 WARN conf.Configuration: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.value.class is deprecated. Instead, use mapreduce.map.output.value.class
12/08/28 23:32:36 WARN conf.Configuration: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
12/08/28 23:32:36 WARN conf.Configuration: mapred.job.name is deprecated. Instead, use mapreduce.job.name
12/08/28 23:32:36 WARN conf.Configuration: mapreduce.inputformat.class is deprecated. Instead, use mapreduce.job.inputformat.class
12/08/28 23:32:36 WARN conf.Configuration: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
12/08/28 23:32:36 WARN conf.Configuration: mapreduce.outputformat.class is deprecated. Instead, use mapreduce.job.outputformat.class
12/08/28 23:32:36 WARN conf.Configuration: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.key.class is deprecated. Instead, use mapreduce.map.output.key.class
12/08/28 23:32:36 WARN conf.Configuration: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
12/08/28 23:32:37 INFO mapred.ResourceMgrDelegate: Submitted application application_1345831938927_0039 to ResourceManager at baby20/10.1.1.40:8040
12/08/28 23:32:37 INFO mapreduce.Job: The url to track the job: http://baby20:8088/proxy/application_1345831938927_0039/
12/08/28 23:32:37 INFO tools.DistCp: DistCp job-id: job_1345831938927_0039
12/08/28 23:32:37 INFO mapreduce.Job: Running job: job_1345831938927_0039
12/08/28 23:32:50 INFO mapreduce.Job: Job job_1345831938927_0039 running in uber mode : false
12/08/28 23:32:50 INFO mapreduce.Job:  map 0% reduce 0%
12/08/28 23:33:00 INFO mapreduce.Job:  map 100% reduce 0%
12/08/28 23:33:00 INFO mapreduce.Job: Task Id : attempt_1345831938927_0039_m_000000_0, Status : FAILED
Error: java.io.IOException: File copy failed: hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 --> hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 to hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
        ... 10 more
Caused by: org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException: java.io.IOException: HTTP_OK expected, received 500
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:201)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:167)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:112)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:90)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:71)
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
        ... 11 more
Caused by: java.io.IOException: HTTP_OK expected, received 500
        at org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCode(HftpFileSystem.java:381)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:121)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:158)
        at java.io.DataInputStream.read(DataInputStream.java:132)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
        at java.io.FilterInputStream.read(FilterInputStream.java:90)
        at org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.java:70)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:198)
        ... 16 more
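
The HTTP 500 here is consistent with the non-ASCII path being sent unencoded in the read request. As an illustration (outside of Hadoop), Java's java.net.URI shows what a correctly percent-encoded request URI for such a path would look like; the /data servlet path below is a hypothetical stand-in, not taken from the Hadoop source:

import java.net.URI;

public class HftpPathEncoding {
    public static void main(String[] args) throws Exception {
        // Hypothetical read URI for the failing file; /data is a stand-in
        // servlet path, the host and port are taken from the log above.
        String rawPath = "/data/tmp/中文路径测试/part-r-00017";
        // The multi-argument URI constructor quotes illegal characters in
        // each component, and toASCIIString() percent-encodes the
        // remaining non-ASCII ones.
        URI uri = new URI("http", null, "10.1.1.26", 50070, rawPath, null, null);
        System.out.println(uri.toASCIIString());
        // Prints .../tmp/%E4%B8%AD%E6%96%87%E8%B7%AF%E5%BE%84%E6%B5%8B%E8%AF%95/part-r-00017
        // If the raw UTF-8 bytes are sent on the wire instead of this
        // form, a strict HTTP server may reject the request, which would
        // match the 400/500 responses in the traces above.
    }
}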







Re: Reply: distcp error.

Posted by Marcos Ortiz <ml...@uci.cu>.
Have you searched the Hadoop JIRA to see whether there is a similar problem to this?
This could be a bug.

On 28/08/2012 12:30, Tao wrote:
>
> Hi,
>
> Thanks for your reply.
>
> I have tried between two 1.0.3 clusters and between two 2.0.1 clusters.
>
> Both failed.
>
> The paths contain Chinese characters.
>
> *1.0.3 hftp to 1.0.3 hdfs; the exception information is below.*
>
> 12/08/29 00:24:23 INFO tools.DistCp: sourcePathsCount=2
>
> 12/08/29 00:24:23 INFO tools.DistCp: filesToCopyCount=1
>
> 12/08/29 00:24:23 INFO tools.DistCp: bytesToCopyCount=1.2k
>
> 12/08/29 00:24:24 INFO mapred.JobClient: Running job:
> job_201208101345_2203
>
> 12/08/29 00:24:25 INFO mapred.JobClient: map 0% reduce 0%
>
> 12/08/29 00:24:46 INFO mapred.JobClient: Task Id :
> attempt_201208101345_2203_m_000000_0, Status : FAILED
>
> java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
>
> at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
>
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
>
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
>
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>
> at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>
> at java.security.AccessController.doPrivileged(Native Method)
>
> at javax.security.auth.Subject.doAs(Subject.java:396)
>
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>
> at org.apache.hadoop.mapred.Child.main(Child.java:249)
>
> 12/08/29 00:25:04 INFO mapred.JobClient: Task Id :
> attempt_201208101345_2203_m_000000_1, Status : FAILED
>
> java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
>
> at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
>
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
>
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
>
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>
> at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>
> at java.security.AccessController.doPrivileged(Native Method)
>
> at javax.security.auth.Subject.doAs(Subject.java:396)
>
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>
> at org.apache.hadoop.mapred.Child.main(Child.java:249)
>
> 12/08/29 00:25:19 INFO mapred.JobClient: Task Id :
> attempt_201208101345_2203_m_000000_2, Status : FAILED
>
> java.io.IOException: Copied: 0 Skipped: 0 Failed: 1
>
> at org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)
>
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
>
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
>
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>
> at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>
> at java.security.AccessController.doPrivileged(Native Method)
>
> at javax.security.auth.Subject.doAs(Subject.java:396)
>
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>
> at org.apache.hadoop.mapred.Child.main(Child.java:249)
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Job complete:
> job_201208101345_2203
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Counters: 6
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Job Counters
>
> 12/08/29 00:25:40 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=66844
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Total time spent by all
> reduces waiting after reserving slots (ms)=0
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Total time spent by all maps
> waiting after reserving slots (ms)=0
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Launched map tasks=4
>
> 12/08/29 00:25:40 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Failed map tasks=1
>
> 12/08/29 00:25:40 INFO mapred.JobClient: Job Failed: # of failed Map
> Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask:
> task_201208101345_2203_m_000000
>
> With failures, global counters are inaccurate; consider running with -i
>
> Copy failed: java.io.IOException: Job failed!
>
> at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1265)
>
> at org.apache.hadoop.tools.DistCp.copy(DistCp.java:667)
>
> at org.apache.hadoop.tools.DistCp.run(DistCp.java:881)
>
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>
> at org.apache.hadoop.tools.DistCp.main(DistCp.java:908)
>
> *2.0.1 hftp to 2.0.1 hdfs; the exception information is below.*
>
> 12/08/29 00:20:06 INFO tools.DistCp: DistCp job-id: job_1345831938927_0043
>
> 12/08/29 00:20:06 INFO mapreduce.Job: Running job: job_1345831938927_0043
>
> 12/08/29 00:20:14 INFO mapreduce.Job: Job job_1345831938927_0043
> running in uber mode : false
>
> 12/08/29 00:20:14 INFO mapreduce.Job: map 0% reduce 0%
>
> 12/08/29 00:20:23 INFO mapreduce.Job: Task Id :
> attempt_1345831938927_0043_m_000000_0, Status : FAILED
>
> Error: java.io.IOException: File copy failed:
> hftp://baby20:50070/tmp/??.log/add.csv -->
> hdfs://baby20:54310/tmp4/add.csv
>
> at
> org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
>
> at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
>
> at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
>
> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>
> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
>
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
>
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
>
> at java.security.AccessController.doPrivileged(Native Method)
>
> at javax.security.auth.Subject.doAs(Subject.java:396)
>
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>
> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
>
> Caused by: java.io.IOException: Couldn't run retriable-command:
> Copying hftp://baby20:50070/tmp/中文.log/add.csv to
> hdfs://baby20:54310/tmp4/add.csv
>
> at
> org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
>
> at
> org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
>
> ... 10 more
>
> Caused by:
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException:
> java.io.IOException: HTTP_OK expected, received 400
>
> at
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:201)
>
> at
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:167)
>
> at
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:112)
>
> at
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:90)
>
> at
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:71)
>
> at
> org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
>
> ... 11 more
>
> Caused by: java.io.IOException: HTTP_OK expected, received 400
>
> at
> org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCode(HftpFileSystem.java:381)
>
> at
> org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:121)
>
> at
> org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
>
> at
> org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:158)
>
> at java.io.DataInputStream.read(DataInputStream.java:132)
>
> at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
>
> at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
>
> at java.io.FilterInputStream.read(FilterInputStream.java:90)
>
> at
> org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.java:70)
>
> at
> org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:198)
>
> ... 16 more
>
> 12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server
> returned HTTP response code: 400 for URL:
> http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_0043_m_000000_0&filter=stdout
>
> 12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server
> returned HTTP response code: 400 for URL:
> http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_0043_m_000000_0&filter=stderr
>
> *From:* Marcos Ortiz [mailto:mlortiz@uci.cu]
> *Sent:* August 28, 2012 23:53
> *To:* user@hadoop.apache.org
> *Cc:* Tao
> *Subject:* Re: distcp error.
>
> Hi, Tao. Does this problem occur only with 2.0.1, or with both versions?
> Have you tried using distcp from 1.0.3 to 1.0.3?
>
> On 28/08/2012 11:36, Tao wrote:
>
>     Hi, all
>
>     I use distcp to copy data from Hadoop 1.0.3 to Hadoop 2.0.1.
>
>     When the file path (or file name) contains Chinese characters, an
>     exception is thrown, as shown below. I need some help with this.
>
>     Thanks.
>
>     [hdfs@host ~]$ hadoop distcp -i -prbugp -m 14 -overwrite -log
>     /tmp/distcp.log hftp://10.xx.xx.aa:50070/tmp/中文路径测试
>     hdfs://10.xx.xx.bb:54310/tmp/distcp_test14
>
>     12/08/28 23:32:31 INFO tools.DistCp: Input Options:
>     DistCpOptions{atomicCommit=false, syncFolder=false,
>     deleteMissing=false, ignoreFailures=true, maxMaps=14,
>     sslConfigurationFile='null', copyStrategy='uniformsize',
>     sourceFileListing=null, sourcePaths=[hftp://10.xx.xx.aa:50070/tmp/中文路径测试],
>     targetPath=hdfs://10.xx.xx.bb:54310/tmp/distcp_test14}
>
>     12/08/28 23:32:33 INFO tools.DistCp: DistCp job log path:
>     /tmp/distcp.log
>
>     12/08/28 23:32:34 WARN conf.Configuration: io.sort.mb is
>     deprecated. Instead, use mapreduce.task.io.sort.mb
>
>     12/08/28 23:32:34 WARN conf.Configuration: io.sort.factor is
>     deprecated. Instead, use mapreduce.task.io.sort.factor
>
>     12/08/28 23:32:34 WARN util.NativeCodeLoader: Unable to load
>     native-hadoop library for your platform... using builtin-java
>     classes where applicable
>
>     12/08/28 23:32:36 INFO mapreduce.JobSubmitter: number of splits:1
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapred.jar is
>     deprecated. Instead, use mapreduce.job.jar
>
>     12/08/28 23:32:36 WARN conf.Configuration:
>     mapred.map.tasks.speculative.execution is deprecated. Instead, use
>     mapreduce.map.speculative
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapred.reduce.tasks is
>     deprecated. Instead, use mapreduce.job.reduces
>
>     12/08/28 23:32:36 WARN conf.Configuration:
>     mapred.mapoutput.value.class is deprecated. Instead, use
>     mapreduce.map.output.value.class
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapreduce.map.class is
>     deprecated. Instead, use mapreduce.job.map.class
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapred.job.name is
>     deprecated. Instead, use mapreduce.job.name
>
>     12/08/28 23:32:36 WARN conf.Configuration:
>     mapreduce.inputformat.class is deprecated. Instead, use
>     mapreduce.job.inputformat.class
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapred.output.dir is
>     deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
>
>     12/08/28 23:32:36 WARN conf.Configuration:
>     mapreduce.outputformat.class is deprecated. Instead, use
>     mapreduce.job.outputformat.class
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapred.map.tasks is
>     deprecated. Instead, use mapreduce.job.maps
>
>     12/08/28 23:32:36 WARN conf.Configuration:
>     mapred.mapoutput.key.class is deprecated. Instead, use
>     mapreduce.map.output.key.class
>
>     12/08/28 23:32:36 WARN conf.Configuration: mapred.working.dir is
>     deprecated. Instead, use mapreduce.job.working.dir
>
>     12/08/28 23:32:37 INFO mapred.ResourceMgrDelegate: Submitted
>     application application_1345831938927_0039 to ResourceManager at
>     baby20/10.1.1.40:8040
>
>     12/08/28 23:32:37 INFO mapreduce.Job: The url to track the job:
>     http://baby20:8088/proxy/application_1345831938927_0039/
>
>     12/08/28 23:32:37 INFO tools.DistCp: DistCp job-id:
>     job_1345831938927_0039
>
>     12/08/28 23:32:37 INFO mapreduce.Job: Running job:
>     job_1345831938927_0039
>
>     12/08/28 23:32:50 INFO mapreduce.Job: Job job_1345831938927_0039
>     running in uber mode : false
>
>     12/08/28 23:32:50 INFO mapreduce.Job: map 0% reduce 0%
>
>     12/08/28 23:33:00 INFO mapreduce.Job: map 100% reduce 0%
>
>     12/08/28 23:33:00 INFO mapreduce.Job: Task Id :
>     attempt_1345831938927_0039_m_000000_0, Status : FAILED
>
>     Error: java.io.IOException: File copy failed:
>     hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 -->
>     hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
>
>     at
>     org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
>
>     at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
>
>     at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
>
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
>
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
>
>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
>
>     at java.security.AccessController.doPrivileged(Native Method)
>
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>
>     at
>     org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>
>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
>
>     Caused by: java.io.IOException: Couldn't run retriable-command:
>     Copying hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 to
>     hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
>
>     at
>     org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
>
>     at
>     org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
>
>     ... 10 more
>
>     Caused by:
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException:
>     java.io.IOException: HTTP_OK expected, received 500
>
>     at
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:201)
>
>     at
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:167)
>
>     at
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:112)
>
>     at
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:90)
>
>     at
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:71)
>
>     at
>     org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
>
>     ... 11 more
>
>     Caused by: java.io.IOException: HTTP_OK expected, received 500
>
>     at
>     org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCode(HftpFileSystem.java:381)
>
>     at
>     org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:121)
>
>     at
>     org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
>
>     at
>     org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:158)
>
>     at java.io.DataInputStream.read(DataInputStream.java:132)
>
>     at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
>
>     at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
>
>     at java.io.FilterInputStream.read(FilterInputStream.java:90)
>
>     at
>     org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.java:70)
>
>     at
>     org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:198)
>
>     ... 16 more
>




10th ANNIVERSARY OF THE FOUNDING OF THE UNIVERSIDAD DE LAS CIENCIAS INFORMATICAS...
CONNECTED TO THE FUTURE, CONNECTED TO THE REVOLUTION

http://www.uci.cu
http://www.facebook.com/universidad.uci
http://www.flickr.com/photos/universidad_uci
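
Until the path-encoding issue is tracked down, one possible workaround is to stage the source data under an ASCII-only path, run distcp over hftp against the staging path, and restore the name afterwards. A minimal, untested sketch using the standard FileSystem API; the namenode URI and staging name are hypothetical:

import java.io.IOException;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class StageAsciiPath {
    public static void main(String[] args) throws Exception {
        // Connect to the source cluster (hypothetical RPC address).
        FileSystem fs = FileSystem.get(URI.create("hdfs://10.1.1.26:54310/"), new Configuration());
        Path original = new Path("/tmp/中文路径测试");
        Path staging = new Path("/tmp/distcp_staging_0001"); // ASCII-only name
        if (!fs.rename(original, staging)) {
            throw new IOException("rename failed: " + original);
        }
        // Now run, e.g.:
        //   hadoop distcp hftp://10.1.1.26:50070/tmp/distcp_staging_0001 \
        //       hdfs://10.1.1.40:54310/tmp/distcp_test14
        // and rename the directory back on both clusters as needed.
        fs.rename(staging, original);
    }
}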



Reply: distcp error.

Posted by Tao <zt...@outlook.com>.
Hi,

         Thanks for your reply.

         I have tried between two 1.0.3 clusters and between two 2.0.1 clusters.

         Both failed.

 

         The paths contain Chinese characters.

         1.0.3 hftp to 1.0.3 hdfs; the exception information is below.

12/08/29 00:24:23 INFO tools.DistCp: sourcePathsCount=2

12/08/29 00:24:23 INFO tools.DistCp: filesToCopyCount=1

12/08/29 00:24:23 INFO tools.DistCp: bytesToCopyCount=1.2k

12/08/29 00:24:24 INFO mapred.JobClient: Running job: job_201208101345_2203

12/08/29 00:24:25 INFO mapred.JobClient:  map 0% reduce 0%

12/08/29 00:24:46 INFO mapred.JobClient: Task Id :
attempt_201208101345_2203_m_000000_0, Status : FAILED

java.io.IOException: Copied: 0 Skipped: 0 Failed: 1

        at
org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)

        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)

        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1121)

        at org.apache.hadoop.mapred.Child.main(Child.java:249)

 

12/08/29 00:25:04 INFO mapred.JobClient: Task Id :
attempt_201208101345_2203_m_000000_1, Status : FAILED

java.io.IOException: Copied: 0 Skipped: 0 Failed: 1

        at
org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)

        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)

        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1121)

        at org.apache.hadoop.mapred.Child.main(Child.java:249)

 

12/08/29 00:25:19 INFO mapred.JobClient: Task Id :
attempt_201208101345_2203_m_000000_2, Status : FAILED

java.io.IOException: Copied: 0 Skipped: 0 Failed: 1

        at
org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)

        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)

        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1121)

        at org.apache.hadoop.mapred.Child.main(Child.java:249)

 

12/08/29 00:25:40 INFO mapred.JobClient: Job complete: job_201208101345_2203

12/08/29 00:25:40 INFO mapred.JobClient: Counters: 6

12/08/29 00:25:40 INFO mapred.JobClient:   Job Counters

12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=66844

12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0

12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0

12/08/29 00:25:40 INFO mapred.JobClient:     Launched map tasks=4

12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0

12/08/29 00:25:40 INFO mapred.JobClient:     Failed map tasks=1

12/08/29 00:25:40 INFO mapred.JobClient: Job Failed: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201208101345_2203_m_000000

With failures, global counters are inaccurate; consider running with -i

Copy failed: java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1265)
        at org.apache.hadoop.tools.DistCp.copy(DistCp.java:667)
        at org.apache.hadoop.tools.DistCp.run(DistCp.java:881)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.tools.DistCp.main(DistCp.java:908)
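
One more data point from the 1.0.3 run: the mapper only reports the summary "Copied: 0 Skipped: 0 Failed: 1", while the listing step before the job (filesToCopyCount=1) already succeeded over hftp. So the listing servlet handles the Chinese path, and the failure looks like it is in the data read. Below is a minimal sketch to confirm that split, assuming the source cluster's client jars are on the classpath (the class name HftpListCheck is only for illustration):

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HftpListCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Source NameNode as in the log; replace with a real host.
            FileSystem fs = FileSystem.get(URI.create("hftp://10.xx.xx.aa:50070/"), conf);

            // Metadata over hftp: goes through the listing servlet and
            // worked for the Chinese path (distcp got filesToCopyCount=1).
            for (FileStatus st : fs.listStatus(new Path("/tmp/中文路径测试"))) {
                System.out.println(st.getPath() + " len=" + st.getLen());

                // Data over hftp: goes through the file-data servlet,
                // which is where the read fails for this path.
                FSDataInputStream in = fs.open(st.getPath());
                System.out.println("first byte: " + in.read());
                in.close();
            }
        }
    }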

                                     

 

         From 2.0.1 hftp to 2.0.1 hdfs, the exception is below.

12/08/29 00:20:06 INFO tools.DistCp: DistCp job-id: job_1345831938927_0043

12/08/29 00:20:06 INFO mapreduce.Job: Running job: job_1345831938927_0043

12/08/29 00:20:14 INFO mapreduce.Job: Job job_1345831938927_0043 running in uber mode : false

12/08/29 00:20:14 INFO mapreduce.Job:  map 0% reduce 0%

12/08/29 00:20:23 INFO mapreduce.Job: Task Id : attempt_1345831938927_0043_m_000000_0, Status : FAILED

Error: java.io.IOException: File copy failed: hftp://baby20:50070/tmp/中文.log/add.csv --> hdfs://baby20:54310/tmp4/add.csv
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hftp://baby20:50070/tmp/中文.log/add.csv to hdfs://baby20:54310/tmp4/add.csv
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
        ... 10 more
Caused by: org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException: java.io.IOException: HTTP_OK expected, received 400
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:201)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:167)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:112)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:90)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:71)
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
        ... 11 more
Caused by: java.io.IOException: HTTP_OK expected, received 400
        at org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCode(HftpFileSystem.java:381)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:121)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:158)
        at java.io.DataInputStream.read(DataInputStream.java:132)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
        at java.io.FilterInputStream.read(FilterInputStream.java:90)
        at org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.java:70)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:198)
        ... 16 more

12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server returned HTTP response code: 400 for URL: http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_0043_m_000000_0&filter=stdout

12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server returned HTTP response code: 400 for URL: http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_0043_m_000000_0&filter=stderr
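
Both failures bottom out in HftpFileSystem's ByteRangeInputStream getting a 400 back (a 500 in the 1.0.3-to-2.0.1 run), i.e. the server rejects the HTTP read request itself. My guess is that the client puts the raw UTF-8 path into the read URL without percent-encoding it. Here is a short sketch of the difference, assuming the hftp read is addressed through the NameNode's /data servlet (host and port taken from the log; the class name HftpUrlEncoding is only for illustration):

    import java.net.URI;
    import java.net.URISyntaxException;

    public class HftpUrlEncoding {
        public static void main(String[] args) throws URISyntaxException {
            // The file that fails to copy, as it appears in the log.
            String path = "/data/tmp/中文.log/add.csv";

            // The multi-argument URI constructor percent-encodes characters
            // that are illegal in a URL; this is the form the HTTP request
            // line needs to carry.
            URI encoded = new URI("http", null, "baby20", 50070, path, null, null);
            System.out.println(encoded.toASCIIString());
            // -> http://baby20:50070/data/tmp/%E4%B8%AD%E6%96%87.log/add.csv

            // Sending the raw UTF-8 bytes instead is the kind of request
            // that draws "HTTP_OK expected, received 400" from the server.
            System.out.println("http://baby20:50070" + path);
        }
    }

If the percent-encoded URL fetches the file (for example with wget) while the raw one gets a 400, that would confirm the problem is client-side escaping rather than the data itself.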

 

         

 

From: Marcos Ortiz [mailto:mlortiz@uci.cu]
Sent: 28 August 2012, 23:53
To: user@hadoop.apache.org
Cc: Tao
Subject: Re: distcp error.

 

Hi, Tao. Is this problem only with 2.0.1, or with both versions?
Have you tried using distcp from 1.0.3 to 1.0.3?

答复: distcp error.

Posted by Tao <zt...@outlook.com>.
Hi,

         Thanks for your reply.

         I have tried between 1.0.3s and 2.0.1s.

         Both are failed.

 

         Path contain Chinese character.

         1.0.3 hftp to 1.0.3 hdfs, exception inform is below.

                                     12/08/29 00:24:23 INFO tools.DistCp:
sourcePathsCount=2

12/08/29 00:24:23 INFO tools.DistCp: filesToCopyCount=1

12/08/29 00:24:23 INFO tools.DistCp: bytesToCopyCount=1.2k

12/08/29 00:24:24 INFO mapred.JobClient: Running job: job_201208101345_2203

12/08/29 00:24:25 INFO mapred.JobClient:  map 0% reduce 0%

12/08/29 00:24:46 INFO mapred.JobClient: Task Id :
attempt_201208101345_2203_m_000000_0, Status : FAILED

java.io.IOException: Copied: 0 Skipped: 0 Failed: 1

        at
org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)

        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)

        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1121)

        at org.apache.hadoop.mapred.Child.main(Child.java:249)

 

12/08/29 00:25:04 INFO mapred.JobClient: Task Id :
attempt_201208101345_2203_m_000000_1, Status : FAILED

java.io.IOException: Copied: 0 Skipped: 0 Failed: 1

        at
org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)

        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)

        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1121)

        at org.apache.hadoop.mapred.Child.main(Child.java:249)

 

12/08/29 00:25:19 INFO mapred.JobClient: Task Id :
attempt_201208101345_2203_m_000000_2, Status : FAILED

java.io.IOException: Copied: 0 Skipped: 0 Failed: 1

        at
org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)

        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)

        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1121)

        at org.apache.hadoop.mapred.Child.main(Child.java:249)

 

12/08/29 00:25:40 INFO mapred.JobClient: Job complete: job_201208101345_2203

12/08/29 00:25:40 INFO mapred.JobClient: Counters: 6

12/08/29 00:25:40 INFO mapred.JobClient:   Job Counters 

12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=66844

12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all reduces
waiting after reserving slots (ms)=0

12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all maps
waiting after reserving slots (ms)=0

12/08/29 00:25:40 INFO mapred.JobClient:     Launched map tasks=4

12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0

12/08/29 00:25:40 INFO mapred.JobClient:     Failed map tasks=1

12/08/29 00:25:40 INFO mapred.JobClient: Job Failed: # of failed Map Tasks
exceeded allowed limit. FailedCount: 1. LastFailedTask:
task_201208101345_2203_m_000000

With failures, global counters are inaccurate; consider running with -i

Copy failed: java.io.IOException: Job failed!

        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1265)

        at org.apache.hadoop.tools.DistCp.copy(DistCp.java:667)

        at org.apache.hadoop.tools.DistCp.run(DistCp.java:881)

        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)

        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)

        at org.apache.hadoop.tools.DistCp.main(DistCp.java:908)

                                     

 

         2.0.1 hftp to 2.0.1 hdfs, exception inform is below.

12/08/29 00:20:06 INFO tools.DistCp: DistCp job-id: job_1345831938927_0043

12/08/29 00:20:06 INFO mapreduce.Job: Running job: job_1345831938927_0043

12/08/29 00:20:14 INFO mapreduce.Job: Job job_1345831938927_0043 running in
uber mode : false

12/08/29 00:20:14 INFO mapreduce.Job:  map 0% reduce 0%

12/08/29 00:20:23 INFO mapreduce.Job: Task Id :
attempt_1345831938927_0043_m_000000_0, Status : FAILED

Error: java.io.IOException: File copy failed:
hftp://baby20:50070/tmp/??.log/add.csv --> hdfs://baby20:54310/tmp4/add.csv

        at
org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:
262)

        at
org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)

        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)

        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)

        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)

        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1232)

        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)

Caused by: java.io.IOException: Couldn't run retriable-command: Copying
hftp://baby20:50070/tmp/中文.log/add.csv to hdfs://baby20:54310/tmp4/add.csv

        at
org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:
101)

        at
org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:
258)

        ... 10 more

Caused by:
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException:
java.io.IOException: HTTP_OK expected, received 400

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableF
ileCopyCommand.java:201)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableF
ileCopyCommand.java:167)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(Retria
bleFileCopyCommand.java:112)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFile
CopyCommand.java:90)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableF
ileCopyCommand.java:71)

        at
org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:
87)

        ... 11 more

Caused by: java.io.IOException: HTTP_OK expected, received 400

        at
org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCo
de(HftpFileSystem.java:381)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputSt
ream.java:121)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStr
eam.java:103)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:1
58)

        at java.io.DataInputStream.read(DataInputStream.java:132)

        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)

        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)

        at java.io.FilterInputStream.read(FilterInputStream.java:90)

        at
org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.
java:70)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableF
ileCopyCommand.java:198)

        ... 16 more

 

12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server
returned HTTP response code: 400 for URL:
http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_00
43_m_000000_0&filter=stdout

12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server
returned HTTP response code: 400 for URL:
http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_00
43_m_000000_0&filter=stderr

 

         

 

发件人: Marcos Ortiz [mailto:mlortiz@uci.cu] 
发送时间: 2012年8月28日 23:53
收件人: user@hadoop.apache.org
抄送: Tao
主题: Re: distcp error.

 

Hi, Tao. This problem is only with 2.0.1 or with the two versions?
Have you tried to use distcp from 1.0.3 to 1.0.3?

El 28/08/2012 11:36, Tao escribió:

Hi, all

         I use distcp copying data from hadoop1.0.3 to hadoop 2.0.1.

         When the file path(or file name) contain Chinese character, an
exception will throw. Like below. I need some help about this.

         Thanks.

         

 

 

 

[hdfs@host ~]$ hadoop distcp -i -prbugp -m 14 -overwrite -log
/tmp/distcp.log hftp://10.xx.xx.aa:50070/tmp/中文路径测试
hdfs://10.xx.xx.bb:54310/tmp/distcp_test14

12/08/28 23:32:31 INFO tools.DistCp: Input Options:
DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false,
ignoreFailures=true, maxMaps=14, sslConfigurationFile='null',
copyStrategy='uniformsize', sourceFileListing=null,
sourcePaths=[hftp://10.xx.xx.aa:50070/tmp/中文路径测试],
targetPath=hdfs://10.xx.xx.bb:54310/tmp/distcp_test14}

12/08/28 23:32:33 INFO tools.DistCp: DistCp job log path: /tmp/distcp.log

12/08/28 23:32:34 WARN conf.Configuration: io.sort.mb is deprecated.
Instead, use mapreduce.task.io.sort.mb

12/08/28 23:32:34 WARN conf.Configuration: io.sort.factor is deprecated.
Instead, use mapreduce.task.io.sort.factor

12/08/28 23:32:34 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable

12/08/28 23:32:36 INFO mapreduce.JobSubmitter: number of splits:1

12/08/28 23:32:36 WARN conf.Configuration: mapred.jar is deprecated.
Instead, use mapreduce.job.jar

12/08/28 23:32:36 WARN conf.Configuration:
mapred.map.tasks.speculative.execution is deprecated. Instead, use
mapreduce.map.speculative

12/08/28 23:32:36 WARN conf.Configuration: mapred.reduce.tasks is
deprecated. Instead, use mapreduce.job.reduces

12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.value.class is
deprecated. Instead, use mapreduce.map.output.value.class

12/08/28 23:32:36 WARN conf.Configuration: mapreduce.map.class is
deprecated. Instead, use mapreduce.job.map.class

12/08/28 23:32:36 WARN conf.Configuration: mapred.job.name is deprecated.
Instead, use mapreduce.job.name

12/08/28 23:32:36 WARN conf.Configuration: mapreduce.inputformat.class is
deprecated. Instead, use mapreduce.job.inputformat.class

12/08/28 23:32:36 WARN conf.Configuration: mapred.output.dir is deprecated.
Instead, use mapreduce.output.fileoutputformat.outputdir

12/08/28 23:32:36 WARN conf.Configuration: mapreduce.outputformat.class is
deprecated. Instead, use mapreduce.job.outputformat.class

12/08/28 23:32:36 WARN conf.Configuration: mapred.map.tasks is deprecated.
Instead, use mapreduce.job.maps

12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.key.class is
deprecated. Instead, use mapreduce.map.output.key.class

12/08/28 23:32:36 WARN conf.Configuration: mapred.working.dir is deprecated.
Instead, use mapreduce.job.working.dir

12/08/28 23:32:37 INFO mapred.ResourceMgrDelegate: Submitted application
application_1345831938927_0039 to ResourceManager at baby20/10.1.1.40:8040

12/08/28 23:32:37 INFO mapreduce.Job: The url to track the job:
http://baby20:8088/proxy/application_1345831938927_0039/

12/08/28 23:32:37 INFO tools.DistCp: DistCp job-id: job_1345831938927_0039

12/08/28 23:32:37 INFO mapreduce.Job: Running job: job_1345831938927_0039

12/08/28 23:32:50 INFO mapreduce.Job: Job job_1345831938927_0039 running in
uber mode : false

12/08/28 23:32:50 INFO mapreduce.Job:  map 0% reduce 0%

12/08/28 23:33:00 INFO mapreduce.Job:  map 100% reduce 0%

12/08/28 23:33:00 INFO mapreduce.Job: Task Id :
attempt_1345831938927_0039_m_000000_0, Status : FAILED

Error: java.io.IOException: File copy failed: hftp://10.1.1.26:50070/tmp/中
文路径测试/part-r-00017 -->
hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017

        at
org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:
262)

        at
org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)

        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)

        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)

        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)

        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1232)

        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)

Caused by: java.io.IOException: Couldn't run retriable-command: Copying
hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 to
hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017

        at
org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:
101)

        at
org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:
258)

        ... 10 more

Caused by:
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException:
java.io.IOException: HTTP_OK expected, received 500

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableF
ileCopyCommand.java:201)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableF
ileCopyCommand.java:167)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(Retria
bleFileCopyCommand.java:112)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFile
CopyCommand.java:90)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableF
ileCopyCommand.java:71)

        at
org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:
87)

        ... 11 more

Caused by: java.io.IOException: HTTP_OK expected, received 500

        at
org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCo
de(HftpFileSystem.java:381)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputSt
ream.java:121)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStr
eam.java:103)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:1
58)

        at java.io.DataInputStream.read(DataInputStream.java:132)

        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)

        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)

        at java.io.FilterInputStream.read(FilterInputStream.java:90)

        at
org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.
java:70)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableF
ileCopyCommand.java:198)

        ... 16 more

 

 

 


 <http://www.uci.cu/> 说明: 图像已被发件人删除。






 <http://www.uci.cu/> 说明: 图像已被发件人删除。

 


答复: distcp error.

Posted by Tao <zt...@outlook.com>.
Hi,

         Thanks for your reply.

         I have tried between 1.0.3s and 2.0.1s.

         Both are failed.

 

         Path contain Chinese character.

         1.0.3 hftp to 1.0.3 hdfs, exception inform is below.

                                     12/08/29 00:24:23 INFO tools.DistCp:
sourcePathsCount=2

12/08/29 00:24:23 INFO tools.DistCp: filesToCopyCount=1

12/08/29 00:24:23 INFO tools.DistCp: bytesToCopyCount=1.2k

12/08/29 00:24:24 INFO mapred.JobClient: Running job: job_201208101345_2203

12/08/29 00:24:25 INFO mapred.JobClient:  map 0% reduce 0%

12/08/29 00:24:46 INFO mapred.JobClient: Task Id :
attempt_201208101345_2203_m_000000_0, Status : FAILED

java.io.IOException: Copied: 0 Skipped: 0 Failed: 1

        at
org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)

        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)

        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1121)

        at org.apache.hadoop.mapred.Child.main(Child.java:249)

 

12/08/29 00:25:04 INFO mapred.JobClient: Task Id :
attempt_201208101345_2203_m_000000_1, Status : FAILED

java.io.IOException: Copied: 0 Skipped: 0 Failed: 1

        at
org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)

        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)

        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1121)

        at org.apache.hadoop.mapred.Child.main(Child.java:249)

 

12/08/29 00:25:19 INFO mapred.JobClient: Task Id :
attempt_201208101345_2203_m_000000_2, Status : FAILED

java.io.IOException: Copied: 0 Skipped: 0 Failed: 1

        at
org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)

        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)

        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1121)

        at org.apache.hadoop.mapred.Child.main(Child.java:249)

 

12/08/29 00:25:40 INFO mapred.JobClient: Job complete: job_201208101345_2203

12/08/29 00:25:40 INFO mapred.JobClient: Counters: 6

12/08/29 00:25:40 INFO mapred.JobClient:   Job Counters 

12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=66844

12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all reduces
waiting after reserving slots (ms)=0

12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all maps
waiting after reserving slots (ms)=0

12/08/29 00:25:40 INFO mapred.JobClient:     Launched map tasks=4

12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0

12/08/29 00:25:40 INFO mapred.JobClient:     Failed map tasks=1

12/08/29 00:25:40 INFO mapred.JobClient: Job Failed: # of failed Map Tasks
exceeded allowed limit. FailedCount: 1. LastFailedTask:
task_201208101345_2203_m_000000

With failures, global counters are inaccurate; consider running with -i

Copy failed: java.io.IOException: Job failed!

        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1265)

        at org.apache.hadoop.tools.DistCp.copy(DistCp.java:667)

        at org.apache.hadoop.tools.DistCp.run(DistCp.java:881)

        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)

        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)

        at org.apache.hadoop.tools.DistCp.main(DistCp.java:908)

                                     

 

         2.0.1 hftp to 2.0.1 hdfs, exception inform is below.

12/08/29 00:20:06 INFO tools.DistCp: DistCp job-id: job_1345831938927_0043

12/08/29 00:20:06 INFO mapreduce.Job: Running job: job_1345831938927_0043

12/08/29 00:20:14 INFO mapreduce.Job: Job job_1345831938927_0043 running in
uber mode : false

12/08/29 00:20:14 INFO mapreduce.Job:  map 0% reduce 0%

12/08/29 00:20:23 INFO mapreduce.Job: Task Id :
attempt_1345831938927_0043_m_000000_0, Status : FAILED

Error: java.io.IOException: File copy failed:
hftp://baby20:50070/tmp/??.log/add.csv --> hdfs://baby20:54310/tmp4/add.csv

        at
org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:
262)

        at
org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)

        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)

        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)

        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)

        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1232)

        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)

Caused by: java.io.IOException: Couldn't run retriable-command: Copying
hftp://baby20:50070/tmp/中文.log/add.csv to hdfs://baby20:54310/tmp4/add.csv

        at
org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:
101)

        at
org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:
258)

        ... 10 more

Caused by:
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException:
java.io.IOException: HTTP_OK expected, received 400

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableF
ileCopyCommand.java:201)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableF
ileCopyCommand.java:167)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(Retria
bleFileCopyCommand.java:112)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFile
CopyCommand.java:90)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableF
ileCopyCommand.java:71)

        at
org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:
87)

        ... 11 more

Caused by: java.io.IOException: HTTP_OK expected, received 400

        at
org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCo
de(HftpFileSystem.java:381)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputSt
ream.java:121)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStr
eam.java:103)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:1
58)

        at java.io.DataInputStream.read(DataInputStream.java:132)

        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)

        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)

        at java.io.FilterInputStream.read(FilterInputStream.java:90)

        at
org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.
java:70)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableF
ileCopyCommand.java:198)

        ... 16 more

 

12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server
returned HTTP response code: 400 for URL:
http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_00
43_m_000000_0&filter=stdout

12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server
returned HTTP response code: 400 for URL:
http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_00
43_m_000000_0&filter=stderr

 

         

 

发件人: Marcos Ortiz [mailto:mlortiz@uci.cu] 
发送时间: 2012年8月28日 23:53
收件人: user@hadoop.apache.org
抄送: Tao
主题: Re: distcp error.

 

Hi, Tao. This problem is only with 2.0.1 or with the two versions?
Have you tried to use distcp from 1.0.3 to 1.0.3?

El 28/08/2012 11:36, Tao escribió:

Hi, all

         I use distcp copying data from hadoop1.0.3 to hadoop 2.0.1.

         When the file path(or file name) contain Chinese character, an
exception will throw. Like below. I need some help about this.

         Thanks.

         

 

 

 

[hdfs@host ~]$ hadoop distcp -i -prbugp -m 14 -overwrite -log
/tmp/distcp.log hftp://10.xx.xx.aa:50070/tmp/中文路径测试
hdfs://10.xx.xx.bb:54310/tmp/distcp_test14

12/08/28 23:32:31 INFO tools.DistCp: Input Options:
DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false,
ignoreFailures=true, maxMaps=14, sslConfigurationFile='null',
copyStrategy='uniformsize', sourceFileListing=null,
sourcePaths=[hftp://10.xx.xx.aa:50070/tmp/中文路径测试],
targetPath=hdfs://10.xx.xx.bb:54310/tmp/distcp_test14}

12/08/28 23:32:33 INFO tools.DistCp: DistCp job log path: /tmp/distcp.log

12/08/28 23:32:34 WARN conf.Configuration: io.sort.mb is deprecated.
Instead, use mapreduce.task.io.sort.mb

12/08/28 23:32:34 WARN conf.Configuration: io.sort.factor is deprecated.
Instead, use mapreduce.task.io.sort.factor

12/08/28 23:32:34 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable

12/08/28 23:32:36 INFO mapreduce.JobSubmitter: number of splits:1

12/08/28 23:32:36 WARN conf.Configuration: mapred.jar is deprecated.
Instead, use mapreduce.job.jar

12/08/28 23:32:36 WARN conf.Configuration:
mapred.map.tasks.speculative.execution is deprecated. Instead, use
mapreduce.map.speculative

12/08/28 23:32:36 WARN conf.Configuration: mapred.reduce.tasks is
deprecated. Instead, use mapreduce.job.reduces

12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.value.class is
deprecated. Instead, use mapreduce.map.output.value.class

12/08/28 23:32:36 WARN conf.Configuration: mapreduce.map.class is
deprecated. Instead, use mapreduce.job.map.class

12/08/28 23:32:36 WARN conf.Configuration: mapred.job.name is deprecated.
Instead, use mapreduce.job.name

12/08/28 23:32:36 WARN conf.Configuration: mapreduce.inputformat.class is
deprecated. Instead, use mapreduce.job.inputformat.class

12/08/28 23:32:36 WARN conf.Configuration: mapred.output.dir is deprecated.
Instead, use mapreduce.output.fileoutputformat.outputdir

12/08/28 23:32:36 WARN conf.Configuration: mapreduce.outputformat.class is
deprecated. Instead, use mapreduce.job.outputformat.class

12/08/28 23:32:36 WARN conf.Configuration: mapred.map.tasks is deprecated.
Instead, use mapreduce.job.maps

12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.key.class is
deprecated. Instead, use mapreduce.map.output.key.class

12/08/28 23:32:36 WARN conf.Configuration: mapred.working.dir is deprecated.
Instead, use mapreduce.job.working.dir

12/08/28 23:32:37 INFO mapred.ResourceMgrDelegate: Submitted application
application_1345831938927_0039 to ResourceManager at baby20/10.1.1.40:8040

12/08/28 23:32:37 INFO mapreduce.Job: The url to track the job:
http://baby20:8088/proxy/application_1345831938927_0039/

12/08/28 23:32:37 INFO tools.DistCp: DistCp job-id: job_1345831938927_0039

12/08/28 23:32:37 INFO mapreduce.Job: Running job: job_1345831938927_0039

12/08/28 23:32:50 INFO mapreduce.Job: Job job_1345831938927_0039 running in
uber mode : false

12/08/28 23:32:50 INFO mapreduce.Job:  map 0% reduce 0%

12/08/28 23:33:00 INFO mapreduce.Job:  map 100% reduce 0%

12/08/28 23:33:00 INFO mapreduce.Job: Task Id :
attempt_1345831938927_0039_m_000000_0, Status : FAILED

Error: java.io.IOException: File copy failed: hftp://10.1.1.26:50070/tmp/中
文路径测试/part-r-00017 -->
hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017

        at
org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:
262)

        at
org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)

        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)

        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)

        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)

        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1232)

        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)

Caused by: java.io.IOException: Couldn't run retriable-command: Copying
hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 to
hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017

        at
org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:
101)

        at
org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:
258)

        ... 10 more

Caused by:
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException:
java.io.IOException: HTTP_OK expected, received 500

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableF
ileCopyCommand.java:201)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableF
ileCopyCommand.java:167)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(Retria
bleFileCopyCommand.java:112)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFile
CopyCommand.java:90)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableF
ileCopyCommand.java:71)

        at
org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:
87)

        ... 11 more

Caused by: java.io.IOException: HTTP_OK expected, received 500

        at
org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCo
de(HftpFileSystem.java:381)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputSt
ream.java:121)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStr
eam.java:103)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:1
58)

        at java.io.DataInputStream.read(DataInputStream.java:132)

        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)

        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)

        at java.io.FilterInputStream.read(FilterInputStream.java:90)

        at
org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.
java:70)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableF
ileCopyCommand.java:198)

        ... 16 more

 

 

 


 <http://www.uci.cu/> 说明: 图像已被发件人删除。






 <http://www.uci.cu/> 说明: 图像已被发件人删除。

 


答复: distcp error.

Posted by Tao <zt...@outlook.com>.
Hi,

         Thanks for your reply.

         I have tried between 1.0.3s and 2.0.1s.

         Both are failed.

 

         Path contain Chinese character.

         1.0.3 hftp to 1.0.3 hdfs, exception inform is below.

                                     12/08/29 00:24:23 INFO tools.DistCp:
sourcePathsCount=2

12/08/29 00:24:23 INFO tools.DistCp: filesToCopyCount=1

12/08/29 00:24:23 INFO tools.DistCp: bytesToCopyCount=1.2k

12/08/29 00:24:24 INFO mapred.JobClient: Running job: job_201208101345_2203

12/08/29 00:24:25 INFO mapred.JobClient:  map 0% reduce 0%

12/08/29 00:24:46 INFO mapred.JobClient: Task Id :
attempt_201208101345_2203_m_000000_0, Status : FAILED

java.io.IOException: Copied: 0 Skipped: 0 Failed: 1

        at
org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)

        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)

        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1121)

        at org.apache.hadoop.mapred.Child.main(Child.java:249)

 

12/08/29 00:25:04 INFO mapred.JobClient: Task Id :
attempt_201208101345_2203_m_000000_1, Status : FAILED

java.io.IOException: Copied: 0 Skipped: 0 Failed: 1

        at
org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)

        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)

        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1121)

        at org.apache.hadoop.mapred.Child.main(Child.java:249)

 

12/08/29 00:25:19 INFO mapred.JobClient: Task Id :
attempt_201208101345_2203_m_000000_2, Status : FAILED

java.io.IOException: Copied: 0 Skipped: 0 Failed: 1

        at
org.apache.hadoop.tools.DistCp$CopyFilesMapper.close(DistCp.java:582)

        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)

        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1121)

        at org.apache.hadoop.mapred.Child.main(Child.java:249)

 

12/08/29 00:25:40 INFO mapred.JobClient: Job complete: job_201208101345_2203

12/08/29 00:25:40 INFO mapred.JobClient: Counters: 6

12/08/29 00:25:40 INFO mapred.JobClient:   Job Counters 

12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=66844

12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all reduces
waiting after reserving slots (ms)=0

12/08/29 00:25:40 INFO mapred.JobClient:     Total time spent by all maps
waiting after reserving slots (ms)=0

12/08/29 00:25:40 INFO mapred.JobClient:     Launched map tasks=4

12/08/29 00:25:40 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0

12/08/29 00:25:40 INFO mapred.JobClient:     Failed map tasks=1

12/08/29 00:25:40 INFO mapred.JobClient: Job Failed: # of failed Map Tasks
exceeded allowed limit. FailedCount: 1. LastFailedTask:
task_201208101345_2203_m_000000

With failures, global counters are inaccurate; consider running with -i

Copy failed: java.io.IOException: Job failed!

        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1265)

        at org.apache.hadoop.tools.DistCp.copy(DistCp.java:667)

        at org.apache.hadoop.tools.DistCp.run(DistCp.java:881)

        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)

        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)

        at org.apache.hadoop.tools.DistCp.main(DistCp.java:908)

                                     

 

         2.0.1 hftp to 2.0.1 hdfs, exception inform is below.

12/08/29 00:20:06 INFO tools.DistCp: DistCp job-id: job_1345831938927_0043

12/08/29 00:20:06 INFO mapreduce.Job: Running job: job_1345831938927_0043

12/08/29 00:20:14 INFO mapreduce.Job: Job job_1345831938927_0043 running in
uber mode : false

12/08/29 00:20:14 INFO mapreduce.Job:  map 0% reduce 0%

12/08/29 00:20:23 INFO mapreduce.Job: Task Id :
attempt_1345831938927_0043_m_000000_0, Status : FAILED

Error: java.io.IOException: File copy failed:
hftp://baby20:50070/tmp/??.log/add.csv --> hdfs://baby20:54310/tmp4/add.csv

        at
org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:
262)

        at
org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)

        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)

        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)

        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)

        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:396)

        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.ja
va:1232)

        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)

Caused by: java.io.IOException: Couldn't run retriable-command: Copying
hftp://baby20:50070/tmp/中文.log/add.csv to hdfs://baby20:54310/tmp4/add.csv

        at
org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:
101)

        at
org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:
258)

        ... 10 more

Caused by:
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException:
java.io.IOException: HTTP_OK expected, received 400

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableF
ileCopyCommand.java:201)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableF
ileCopyCommand.java:167)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(Retria
bleFileCopyCommand.java:112)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFile
CopyCommand.java:90)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableF
ileCopyCommand.java:71)

        at
org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:
87)

        ... 11 more

Caused by: java.io.IOException: HTTP_OK expected, received 400

        at
org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCo
de(HftpFileSystem.java:381)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputSt
ream.java:121)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStr
eam.java:103)

        at
org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:1
58)

        at java.io.DataInputStream.read(DataInputStream.java:132)

        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)

        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)

        at java.io.FilterInputStream.read(FilterInputStream.java:90)

        at
org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.
java:70)

        at
org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableF
ileCopyCommand.java:198)

        ... 16 more

 

12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server
returned HTTP response code: 400 for URL:
http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_00
43_m_000000_0&filter=stdout

12/08/29 00:20:23 WARN mapreduce.Job: Error reading task output Server
returned HTTP response code: 400 for URL:
http://baby19:8080/tasklog?plaintext=true&attemptid=attempt_1345831938927_00
43_m_000000_0&filter=stderr

 

         

 

发件人: Marcos Ortiz [mailto:mlortiz@uci.cu] 
发送时间: 2012年8月28日 23:53
收件人: user@hadoop.apache.org
抄送: Tao
主题: Re: distcp error.

 

Hi, Tao. This problem is only with 2.0.1 or with the two versions?
Have you tried to use distcp from 1.0.3 to 1.0.3?

El 28/08/2012 11:36, Tao escribió:

Hi, all

         I use distcp copying data from hadoop1.0.3 to hadoop 2.0.1.

         When the file path(or file name) contain Chinese character, an
exception will throw. Like below. I need some help about this.

         Thanks.

         

 

 

 

[hdfs@host ~]$ hadoop distcp -i -prbugp -m 14 -overwrite -log
/tmp/distcp.log hftp://10.xx.xx.aa:50070/tmp/中文路径测试
hdfs://10.xx.xx.bb:54310/tmp/distcp_test14

12/08/28 23:32:31 INFO tools.DistCp: Input Options:
DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false,
ignoreFailures=true, maxMaps=14, sslConfigurationFile='null',
copyStrategy='uniformsize', sourceFileListing=null,
sourcePaths=[hftp://10.xx.xx.aa:50070/tmp/中文路径测试],
targetPath=hdfs://10.xx.xx.bb:54310/tmp/distcp_test14}

12/08/28 23:32:33 INFO tools.DistCp: DistCp job log path: /tmp/distcp.log

12/08/28 23:32:34 WARN conf.Configuration: io.sort.mb is deprecated.
Instead, use mapreduce.task.io.sort.mb

12/08/28 23:32:34 WARN conf.Configuration: io.sort.factor is deprecated.
Instead, use mapreduce.task.io.sort.factor

12/08/28 23:32:34 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable

12/08/28 23:32:36 INFO mapreduce.JobSubmitter: number of splits:1

12/08/28 23:32:36 WARN conf.Configuration: mapred.jar is deprecated.
Instead, use mapreduce.job.jar

12/08/28 23:32:36 WARN conf.Configuration:
mapred.map.tasks.speculative.execution is deprecated. Instead, use
mapreduce.map.speculative

12/08/28 23:32:36 WARN conf.Configuration: mapred.reduce.tasks is
deprecated. Instead, use mapreduce.job.reduces

12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.value.class is
deprecated. Instead, use mapreduce.map.output.value.class

12/08/28 23:32:36 WARN conf.Configuration: mapreduce.map.class is
deprecated. Instead, use mapreduce.job.map.class

12/08/28 23:32:36 WARN conf.Configuration: mapred.job.name is deprecated.
Instead, use mapreduce.job.name

12/08/28 23:32:36 WARN conf.Configuration: mapreduce.inputformat.class is
deprecated. Instead, use mapreduce.job.inputformat.class

12/08/28 23:32:36 WARN conf.Configuration: mapred.output.dir is deprecated.
Instead, use mapreduce.output.fileoutputformat.outputdir

12/08/28 23:32:36 WARN conf.Configuration: mapreduce.outputformat.class is
deprecated. Instead, use mapreduce.job.outputformat.class

12/08/28 23:32:36 WARN conf.Configuration: mapred.map.tasks is deprecated.
Instead, use mapreduce.job.maps

12/08/28 23:32:36 WARN conf.Configuration: mapred.mapoutput.key.class is
deprecated. Instead, use mapreduce.map.output.key.class

12/08/28 23:32:36 WARN conf.Configuration: mapred.working.dir is deprecated.
Instead, use mapreduce.job.working.dir

12/08/28 23:32:37 INFO mapred.ResourceMgrDelegate: Submitted application
application_1345831938927_0039 to ResourceManager at baby20/10.1.1.40:8040

12/08/28 23:32:37 INFO mapreduce.Job: The url to track the job:
http://baby20:8088/proxy/application_1345831938927_0039/

12/08/28 23:32:37 INFO tools.DistCp: DistCp job-id: job_1345831938927_0039

12/08/28 23:32:37 INFO mapreduce.Job: Running job: job_1345831938927_0039

12/08/28 23:32:50 INFO mapreduce.Job: Job job_1345831938927_0039 running in
uber mode : false

12/08/28 23:32:50 INFO mapreduce.Job:  map 0% reduce 0%

12/08/28 23:33:00 INFO mapreduce.Job:  map 100% reduce 0%

12/08/28 23:33:00 INFO mapreduce.Job: Task Id : attempt_1345831938927_0039_m_000000_0, Status : FAILED

Error: java.io.IOException: File copy failed: hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 --> hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:262)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:229)
        at org.apache.hadoop.tools.mapred.CopyMapper.map(CopyMapper.java:45)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
Caused by: java.io.IOException: Couldn't run retriable-command: Copying hftp://10.1.1.26:50070/tmp/中文路径测试/part-r-00017 to hdfs://10.1.1.40:54310/tmp/distcp_test14/part-r-00017
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:101)
        at org.apache.hadoop.tools.mapred.CopyMapper.copyFileWithRetry(CopyMapper.java:258)
        ... 10 more
Caused by: org.apache.hadoop.tools.mapred.RetriableFileCopyCommand$CopyReadException: java.io.IOException: HTTP_OK expected, received 500
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:201)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyBytes(RetriableFileCopyCommand.java:167)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.copyToTmpFile(RetriableFileCopyCommand.java:112)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doCopy(RetriableFileCopyCommand.java:90)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.doExecute(RetriableFileCopyCommand.java:71)
        at org.apache.hadoop.tools.util.RetriableCommand.execute(RetriableCommand.java:87)
        ... 11 more
Caused by: java.io.IOException: HTTP_OK expected, received 500
        at org.apache.hadoop.hdfs.HftpFileSystem$RangeHeaderInputStream.checkResponseCode(HftpFileSystem.java:381)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.openInputStream(ByteRangeInputStream.java:121)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.getInputStream(ByteRangeInputStream.java:103)
        at org.apache.hadoop.hdfs.ByteRangeInputStream.read(ByteRangeInputStream.java:158)
        at java.io.DataInputStream.read(DataInputStream.java:132)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
        at java.io.FilterInputStream.read(FilterInputStream.java:90)
        at org.apache.hadoop.tools.util.ThrottledInputStream.read(ThrottledInputStream.java:70)
        at org.apache.hadoop.tools.mapred.RetriableFileCopyCommand.readBytes(RetriableFileCopyCommand.java:198)
        ... 16 more
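
The innermost "Caused by" shows the 500 coming from the hftp read on the 1.0.3 side, which looks like an encoding problem: the client builds a GET URL from the multi-byte path and the server side fails to decode it. As a sketch of how to check that by hand (EncodePath below is a hypothetical helper, not part of Hadoop), the UTF-8 percent-encoded form of the path can be computed like this:

import java.net.URLEncoder;

public class EncodePath {
    public static void main(String[] args) throws Exception {
        String path = "/tmp/中文路径测试";
        StringBuilder encoded = new StringBuilder();
        for (String segment : path.split("/")) {
            if (segment.isEmpty()) {
                continue; // split() yields an empty segment for the leading '/'
            }
            // URLEncoder produces form encoding, so map '+' back to %20
            // to get a proper percent-encoded path segment
            encoded.append('/')
                   .append(URLEncoder.encode(segment, "UTF-8").replace("+", "%20"));
        }
        // Prints: /tmp/%E4%B8%AD%E6%96%87%E8%B7%AF%E5%BE%84%E6%B5%8B%E8%AF%95
        System.out.println(encoded);
    }
}

Assuming the 1.x hftp read path is served under /data on the NameNode (and possibly needs a ugi=<user> query parameter), fetching http://10.1.1.26:50070/data/tmp/%E4%B8%AD%E6%96%87%E8%B7%AF%E5%BE%84%E6%B5%8B%E8%AF%95/part-r-00017 with curl would show whether the server can deliver the file at all, independent of DistCp.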

 

 

 



Re: distcp error.

Posted by Marcos Ortiz <ml...@uci.cu>.
Hi, Tao. Does this problem occur only with 2.0.1, or with both versions?
Have you tried running distcp from 1.0.3 to 1.0.3?
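
One quick way to separate the two variables, as a sketch (the ascii_only and distcp_* paths are hypothetical, 54310 is assumed to be the NameNode RPC port on the 1.0.3 cluster too, and the second command should be run with the 1.0.3 client on the source cluster):

[hdfs@host ~]$ hadoop distcp hftp://10.xx.xx.aa:50070/tmp/ascii_only hdfs://10.xx.xx.bb:54310/tmp/distcp_ascii
[hdfs@host ~]$ hadoop distcp hdfs://10.xx.xx.aa:54310/tmp/中文路径测试 hdfs://10.xx.xx.aa:54310/tmp/distcp_same_version

If the ASCII-only copy succeeds across versions and the same-version copy handles the Chinese path, the failure is specific to non-ASCII paths over hftp rather than to the 1.0.3-to-2.0.1 hop itself.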


Re: distcp error.

Posted by 심병렬 <si...@gmail.com>.
Unsubscribe


Re: distcp error.

Posted by Dan Young <da...@gmail.com>.
I was just reading about this in the Hadoop Definitive Guide last night. The
two clusters need to be on the same version for a plain hdfs-to-hdfs copy; to
copy between versions you can use hftp.

Regards

Dano
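
Since hftp is exactly what is failing on this path, webhdfs is another cross-version option to try; it exists on both 1.0.3 and 2.0.1, assuming dfs.webhdfs.enabled is set to true on the source cluster (/tmp/distcp_test15 below is just a fresh hypothetical target):

[hdfs@host ~]$ hadoop distcp -i -log /tmp/distcp.log webhdfs://10.xx.xx.aa:50070/tmp/中文路径测试 hdfs://10.xx.xx.bb:54310/tmp/distcp_test15

WebHDFS builds and decodes its request URLs differently from the old hftp servlets, so a path that breaks one may pass through the other.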