Posted to common-issues@hadoop.apache.org by "Vinay (JIRA)" <ji...@apache.org> on 2013/11/12 15:02:18 UTC

[jira] [Commented] (HADOOP-9114) After defining dfs.checksum.type as NULL, writing a file and calling hflush will throw java.lang.ArrayIndexOutOfBoundsException

    [ https://issues.apache.org/jira/browse/HADOOP-9114?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13820110#comment-13820110 ] 

Vinay commented on HADOOP-9114:
-------------------------------

Thanks, Sathish, for posting the patch.
+1, patch looks good to me.

> After defining dfs.checksum.type as NULL, writing a file and calling hflush will throw java.lang.ArrayIndexOutOfBoundsException
> ---------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-9114
>                 URL: https://issues.apache.org/jira/browse/HADOOP-9114
>             Project: Hadoop Common
>          Issue Type: Bug
>    Affects Versions: 2.0.1-alpha
>            Reporter: liuyang
>            Priority: Minor
>         Attachments: FSOutputSummer.java.patch, HADOOP-9114-001.patch
>
>
> While testing the dfs.checksum.type configuration parameter, whose value can be NULL, CRC32C, or CRC32: writes succeed when the value is CRC32C or CRC32, but the client throws java.lang.ArrayIndexOutOfBoundsException when the value is configured as NULL.
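
A minimal repro sketch for the quoted description, assuming a reachable HDFS cluster and a Hadoop 2.x client on the classpath; the class name and output path are illustrative, and the comment on the failure point is inferred from the attached FSOutputSummer.java.patch:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ChecksumNullRepro {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // The setting under test: disable per-chunk checksums entirely.
            conf.set("dfs.checksum.type", "NULL");
            FileSystem fs = FileSystem.get(conf);
            FSDataOutputStream out =
                fs.create(new Path("/tmp/checksum-null-repro"));
            out.write("hello".getBytes("UTF-8"));
            // Before the fix, this hflush() reportedly failed with
            // java.lang.ArrayIndexOutOfBoundsException, since the client-side
            // write path did not account for a zero-byte checksum.
            out.hflush();
            out.close();
            fs.close();
        }
    }

With dfs.checksum.type set to CRC32 or CRC32C instead, the same sequence completes normally, which matches the reporter's observation.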



--
This message was sent by Atlassian JIRA
(v6.1#6144)