Posted to dev@hive.apache.org by "Brock Noland (JIRA)" <ji...@apache.org> on 2014/03/29 16:31:18 UTC

[jira] [Commented] (HIVE-6784) parquet-hive should allow column type change

    [ https://issues.apache.org/jira/browse/HIVE-6784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13954289#comment-13954289 ] 

Brock Noland commented on HIVE-6784:
------------------------------------

FYI [~jcoffey] [~xuefuz]

> parquet-hive should allow column type change
> --------------------------------------------
>
>                 Key: HIVE-6784
>                 URL: https://issues.apache.org/jira/browse/HIVE-6784
>             Project: Hive
>          Issue Type: Bug
>          Components: File Formats, Serializers/Deserializers
>    Affects Versions: 0.13.0
>            Reporter: Tongjie Chen
>
> See also the following parquet-mr issue:
> https://github.com/Parquet/parquet-mr/issues/323
> Currently, if we change the column type of a Parquet-format Hive table using "alter table parquet_table change c1 c1 bigint" (assuming the original type of c1 is int), it results in an exception thrown from the SerDe at query runtime: "org.apache.hadoop.io.IntWritable cannot be cast to org.apache.hadoop.io.LongWritable".
> This is different from Hive's behavior with other file formats, where it will try to perform a cast (producing a null value in case of an incompatible type).
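A minimal HiveQL sketch of the scenario described above, assuming a Parquet-backed table named parquet_table with an INT column c1 (the names follow the example in the description; INSERT ... VALUES requires Hive 0.14+, so on 0.13 populate the table from an existing table instead):

    -- create a Parquet table with an INT column and write a row,
    -- so the data files on disk store c1 as a 32-bit int
    CREATE TABLE parquet_table (c1 INT) STORED AS PARQUET;
    INSERT INTO TABLE parquet_table VALUES (1);

    -- widen the column type; only the metastore schema changes,
    -- the existing Parquet files still contain int values
    ALTER TABLE parquet_table CHANGE c1 c1 BIGINT;

    -- reading the old data back now fails at query runtime with
    -- "org.apache.hadoop.io.IntWritable cannot be cast to
    -- org.apache.hadoop.io.LongWritable", instead of being cast
    -- (or returned as NULL) the way other file formats behave
    SELECT c1 FROM parquet_table;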



--
This message was sent by Atlassian JIRA
(v6.2#6252)