Posted to common-dev@hadoop.apache.org by "Milind Bhandarkar (JIRA)" <ji...@apache.org> on 2007/01/08 23:22:27 UTC

[jira] Created: (HADOOP-866) dfs -get should remove existing crc file if -crc is not specified

dfs -get should remove existing crc file if -crc is not specified
-----------------------------------------------------------------

                 Key: HADOOP-866
                 URL: https://issues.apache.org/jira/browse/HADOOP-866
             Project: Hadoop
          Issue Type: Bug
          Components: fs
    Affects Versions: 0.10.0
         Environment: All
            Reporter: Milind Bhandarkar
         Assigned To: Milind Bhandarkar
             Fix For: 0.11.0


When the -crc option was added to dfs -get (aka dfs -copyToLocal), omitting that command-line option came to mean that the crc file associated with the HDFS file is not copied. However, if a checksum file already exists locally, it will not correspond to the newly copied data file, and opening the data file will cause a checksum failure. The solution is to remove any existing crc file corresponding to the data file when -get is invoked without -crc. Patch forthcoming.
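
For illustration only, here is a minimal sketch of the intended behavior. The class and helper names below are hypothetical and this is not the forthcoming patch; it just shows the idea of deleting a stale local checksum file when -crc is not requested.

    // Sketch only (hypothetical names); the real change is in the
    // forthcoming hadoop-get.patch.
    import java.io.File;
    import java.io.IOException;

    public class CopyToLocalSketch {

      // Local checksum files follow the ".<name>.crc" naming convention.
      static File crcFileFor(File dataFile) {
        return new File(dataFile.getParentFile(),
                        "." + dataFile.getName() + ".crc");
      }

      // Copy an HDFS file to the local filesystem. When copyCrc is false,
      // any pre-existing local crc file is deleted so a stale checksum
      // cannot be matched against the freshly copied data.
      static void copyToLocal(String src, File dst, boolean copyCrc)
          throws IOException {
        // ... copy the data file from HDFS to dst (omitted in this sketch) ...

        if (copyCrc) {
          // ... also copy the corresponding crc file from HDFS (omitted) ...
        } else {
          File crc = crcFileFor(dst);
          if (crc.exists() && !crc.delete()) {
            throw new IOException("could not remove stale checksum file " + crc);
          }
        }
      }
    }

For example, if an earlier "dfs -get -crc" left ./.part-00000.crc behind, a later "dfs -get" of the same file without -crc would delete that stale checksum file instead of leaving it to trigger a spurious checksum failure.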

-- 
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators: https://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] Updated: (HADOOP-866) dfs -get should remove existing crc file if -crc is not specified

Posted by "Milind Bhandarkar (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HADOOP-866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Milind Bhandarkar updated HADOOP-866:
-------------------------------------

    Status: Patch Available  (was: In Progress)

> dfs -get should remove existing crc file if -crc is not specified
> -----------------------------------------------------------------
>
>                 Key: HADOOP-866
>                 URL: https://issues.apache.org/jira/browse/HADOOP-866
>             Project: Hadoop
>          Issue Type: Bug
>          Components: fs
>    Affects Versions: 0.10.0
>         Environment: All
>            Reporter: Milind Bhandarkar
>         Assigned To: Milind Bhandarkar
>             Fix For: 0.11.0
>
>         Attachments: hadoop-get.patch
>
>
> When the -crc option was added to dfs -get (aka dfs -copyToLocal), omitting that command-line option came to mean that the crc file associated with the HDFS file is not copied. However, if a checksum file already exists locally, it will not correspond to the newly copied data file, and opening the data file will cause a checksum failure. The solution is to remove any existing crc file corresponding to the data file when -get is invoked without -crc. Patch forthcoming.


[jira] Updated: (HADOOP-866) dfs -get should remove existing crc file if -crc is not specified

Posted by "Milind Bhandarkar (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HADOOP-866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Milind Bhandarkar updated HADOOP-866:
-------------------------------------

    Attachment: hadoop-get.patch

> dfs -get should remove existing crc file if -crc is not specified
> -----------------------------------------------------------------
>
>                 Key: HADOOP-866
>                 URL: https://issues.apache.org/jira/browse/HADOOP-866
>             Project: Hadoop
>          Issue Type: Bug
>          Components: fs
>    Affects Versions: 0.10.0
>         Environment: All
>            Reporter: Milind Bhandarkar
>         Assigned To: Milind Bhandarkar
>             Fix For: 0.11.0
>
>         Attachments: hadoop-get.patch
>
>
> When the -crc option was added to dfs -get (aka dfs -copyToLocal), omitting that command-line option came to mean that the crc file associated with the HDFS file is not copied. However, if a checksum file already exists locally, it will not correspond to the newly copied data file, and opening the data file will cause a checksum failure. The solution is to remove any existing crc file corresponding to the data file when -get is invoked without -crc. Patch forthcoming.


[jira] Work started: (HADOOP-866) dfs -get should remove existing crc file if -crc is not specified

Posted by "Milind Bhandarkar (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HADOOP-866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Work on HADOOP-866 started by Milind Bhandarkar.

> dfs -get should remove existing crc file if -crc is not specified
> -----------------------------------------------------------------
>
>                 Key: HADOOP-866
>                 URL: https://issues.apache.org/jira/browse/HADOOP-866
>             Project: Hadoop
>          Issue Type: Bug
>          Components: fs
>    Affects Versions: 0.10.0
>         Environment: All
>            Reporter: Milind Bhandarkar
>         Assigned To: Milind Bhandarkar
>             Fix For: 0.11.0
>
>
> When the -crc option was added to dfs -get (aka dfs -copyToLocal), omitting that command-line option came to mean that the crc file associated with the HDFS file is not copied. However, if a checksum file already exists locally, it will not correspond to the newly copied data file, and opening the data file will cause a checksum failure. The solution is to remove any existing crc file corresponding to the data file when -get is invoked without -crc. Patch forthcoming.


[jira] Updated: (HADOOP-866) dfs -get should remove existing crc file if -crc is not specified

Posted by "Doug Cutting (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HADOOP-866?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Doug Cutting updated HADOOP-866:
--------------------------------

       Resolution: Fixed
    Fix Version/s:     (was: 0.11.0)
                   0.10.1
           Status: Resolved  (was: Patch Available)

I just fixed this.  Thanks, Milind!

> dfs -get should remove existing crc file if -crc is not specified
> -----------------------------------------------------------------
>
>                 Key: HADOOP-866
>                 URL: https://issues.apache.org/jira/browse/HADOOP-866
>             Project: Hadoop
>          Issue Type: Bug
>          Components: fs
>    Affects Versions: 0.10.0
>         Environment: All
>            Reporter: Milind Bhandarkar
>         Assigned To: Milind Bhandarkar
>             Fix For: 0.10.1
>
>         Attachments: hadoop-get.patch
>
>
> When the -crc option was added to dfs -get (aka dfs -copyToLocal), omitting that command-line option came to mean that the crc file associated with the HDFS file is not copied. However, if a checksum file already exists locally, it will not correspond to the newly copied data file, and opening the data file will cause a checksum failure. The solution is to remove any existing crc file corresponding to the data file when -get is invoked without -crc. Patch forthcoming.


[jira] Commented: (HADOOP-866) dfs -get should remove existing crc file if -crc is not specified

Posted by "Hadoop QA (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HADOOP-866?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12463149 ] 

Hadoop QA commented on HADOOP-866:
----------------------------------

+1, because http://issues.apache.org/jira/secure/attachment/12348516/hadoop-get.patch applied and successfully tested against trunk revision r494172.

> dfs -get should remove existing crc file if -crc is not specified
> -----------------------------------------------------------------
>
>                 Key: HADOOP-866
>                 URL: https://issues.apache.org/jira/browse/HADOOP-866
>             Project: Hadoop
>          Issue Type: Bug
>          Components: fs
>    Affects Versions: 0.10.0
>         Environment: All
>            Reporter: Milind Bhandarkar
>         Assigned To: Milind Bhandarkar
>             Fix For: 0.11.0
>
>         Attachments: hadoop-get.patch
>
>
> When the -crc option was added to dfs -get (aka dfs -copyToLocal), omitting that command-line option came to mean that the crc file associated with the HDFS file is not copied. However, if a checksum file already exists locally, it will not correspond to the newly copied data file, and opening the data file will cause a checksum failure. The solution is to remove any existing crc file corresponding to the data file when -get is invoked without -crc. Patch forthcoming.
