Posted to common-user@hadoop.apache.org by Alexander Aristov <al...@gmail.com> on 2008/11/17 13:57:46 UTC

hadoop 0.18.2 Checksum ok was sent and should not be sent again

Hi all
I upgraded Hadoop to version 0.18.2 and tried to run a test job, a distcp from S3 to HDFS.
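The command looked roughly like this (bucket name, paths and namenode address are placeholders; credentials for the s3:// block filesystem can be set via fs.s3.awsAccessKeyId / fs.s3.awsSecretAccessKey in hadoop-site.xml):

    hadoop distcp s3://my-backup-bucket/logs/ hdfs://namenode:9000/user/data/logs/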


I got a lot of INFO-level errors, although the job finished successfully.

Any ideas? Can I simply suppress INFOs in log4j and forget about the error?
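For instance, would a single line like this in conf/log4j.properties be the right way to hide it? (The logger name is my guess based on the fs.FSInputChecker prefix in the output.)

    # only show WARN and above from the class that logs this message
    log4j.logger.org.apache.hadoop.fs.FSInputChecker=WARN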


08/11/17 07:43:09 INFO fs.FSInputChecker: java.io.IOException: Checksum ok was sent and should not be sent again
        at org.apache.hadoop.dfs.DFSClient$BlockReader.read(DFSClient.java:863)
        at org.apache.hadoop.dfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1392)
        at org.apache.hadoop.dfs.DFSClient$DFSInputStream.read(DFSClient.java:1428)
        at org.apache.hadoop.dfs.DFSClient$DFSInputStream.read(DFSClient.java:1377)
        at java.io.DataInputStream.readInt(DataInputStream.java:372)
        at org.apache.hadoop.io.SequenceFile$Reader.readRecordLength(SequenceFile.java:1898)
        at org.apache.hadoop.io.SequenceFile$Reader.nextRaw(SequenceFile.java:1961)
        at org.apache.hadoop.io.SequenceFile$Sorter$SortPass.run(SequenceFile.java:2399)
        at org.apache.hadoop.io.SequenceFile$Sorter.sortPass(SequenceFile.java:2335)
        at org.apache.hadoop.io.SequenceFile$Sorter.sort(SequenceFile.java:2285)
        at org.apache.hadoop.io.SequenceFile$Sorter.sort(SequenceFile.java:2326)
        at org.apache.hadoop.tools.DistCp.checkDuplication(DistCp.java:1032)
        at org.apache.hadoop.tools.DistCp.setup(DistCp.java:1013)
        at org.apache.hadoop.tools.DistCp.copy(DistCp.java:618)
        at org.apache.hadoop.tools.DistCp.run(DistCp.java:768)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.tools.DistCp.main(DistCp.java:788)





-- 
Best Regards
Alexander Aristov

Re: hadoop 0.18.2 Checksum ok was sent and should not be sent again

Posted by Raghu Angadi <ra...@yahoo-inc.com>.
Rong-en Fan wrote:
> I believe it was for debugging purposes and was removed after 0.18.2 was released.

Yes. It is fixed in 0.18.3 (HADOOP-4499).
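After upgrading, running the version command on a node is an easy sanity check that the new release is actually the one on the classpath:

    hadoop version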

Raghu.

> On Mon, Nov 17, 2008 at 8:57 PM, Alexander Aristov <alexander.aristov@gmail.com> wrote:
> 
>> Hi all
>> I upgraded Hadoop to version 0.18.2 and tried to run a test job, a distcp from S3 to HDFS.
>>
>>
>> I got a lot of INFO-level errors, although the job finished successfully.
>>
>> Any ideas? Can I simply suppress INFOs in log4j and forget about the error?
>

Re: hadoop 0.18.2 Checksum ok was sent and should not be sent again

Posted by Rong-en Fan <gr...@gmail.com>.
I believe it was for debugging purposes and was removed after 0.18.2 was released.

On Mon, Nov 17, 2008 at 8:57 PM, Alexander Aristov <alexander.aristov@gmail.com> wrote:

> Hi all
> I upgraded Hadoop to version 0.18.2 and tried to run a test job, a distcp from S3 to HDFS.
>
>
> I got a lot of INFO-level errors, although the job finished successfully.
>
> Any ideas? Can I simply suppress INFOs in log4j and forget about the error?