Posted to common-issues@hadoop.apache.org by "Lars Ailo Bongo (JIRA)" <ji...@apache.org> on 2011/03/18 19:08:29 UTC
[jira] Commented: (HADOOP-7199) Crash due to reuse of checksum files
[ https://issues.apache.org/jira/browse/HADOOP-7199?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13008553#comment-13008553 ]
Lars Ailo Bongo commented on HADOOP-7199:
-----------------------------------------
Title and description have been updated.
> Crash due to reuse of checksum files
> ------------------------------------
>
> Key: HADOOP-7199
> URL: https://issues.apache.org/jira/browse/HADOOP-7199
> Project: Hadoop Common
> Issue Type: Bug
> Components: fs
> Affects Versions: 0.20.2
> Environment: Cloudera CDH3B4 in pseudo mode on a Linux 2.6.32-28-generic #55-Ubuntu SMP x86_64 kernel, with Java HotSpot 64-Bit Server VM (build 19.1-b02, mixed mode)
> Reporter: Lars Ailo Bongo
> Priority: Minor
>
> copyFromLocalFile crashes if a checksum file exists on the local filesystem and the checksum does not match the file content. This will, for example, crash "hadoop fs -put ./foo ./foo" with a non-descriptive error.
> It is therefore not possible to do:
> 1. copyToLocalFile(hdfsFile, localFile) // creates checksum file
> 2. modify localFile
> 3. copyFromLocalFile(localFile, hdfsFile) // uses old checksum
> Solution: do not reuse checksum files, or add a parameter to copyFromLocalFile that specifies that checksum files should not be reused.
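The failure mode above can be illustrated without a Hadoop cluster. This is not Hadoop's actual code, only a minimal sketch of the mechanism: LocalFileSystem keeps a CRC for ./foo in a hidden ./.foo.crc sidecar file and verifies it on read, so modifying the data file without updating the sidecar makes verification fail (the temp-file names below are hypothetical stand-ins):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.CRC32;

public class StaleChecksumDemo {
    // Compute a CRC32 over a byte array, as a stand-in for Hadoop's
    // per-chunk checksums.
    static long crcOf(byte[] data) {
        CRC32 crc = new CRC32();
        crc.update(data);
        return crc.getValue();
    }

    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("foo", ".txt");
        Path crcFile = Files.createTempFile("foo", ".crc"); // stands in for .foo.crc

        // Step 1: "copyToLocalFile" writes the data and its checksum sidecar.
        Files.write(file, "original".getBytes());
        Files.write(crcFile,
                Long.toString(crcOf(Files.readAllBytes(file))).getBytes());

        // Step 2: the local file is modified, but the sidecar is not updated.
        Files.write(file, "modified".getBytes());

        // Step 3: "copyFromLocalFile" re-reads the stale checksum and compares
        // it against the current content; the mismatch is the reported crash.
        long stored = Long.parseLong(new String(Files.readAllBytes(crcFile)));
        long actual = crcOf(Files.readAllBytes(file));
        System.out.println(stored == actual ? "checksums match" : "checksum error");
    }
}
```

Deleting the stale sidecar (or regenerating it from the modified file) makes step 3 succeed, which is why not reusing old checksum files resolves the issue.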
--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira