Posted to issues@hive.apache.org by "chengkun jia (JIRA)" <ji...@apache.org> on 2019/04/09 06:53:00 UTC

[jira] [Commented] (HIVE-21185) insert overwrite directory ... stored as nontextfile raise exception with merge files open

    [ https://issues.apache.org/jira/browse/HIVE-21185?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16813048#comment-16813048 ] 

chengkun jia commented on HIVE-21185:
-------------------------------------

I think this issue is resolved in https://issues.apache.org/jira/browse/HIVE-18833

That's exactly what I wanted.

 

> insert overwrite directory ... stored as nontextfile raise exception with merge files open
> ------------------------------------------------------------------------------------------
>
>                 Key: HIVE-21185
>                 URL: https://issues.apache.org/jira/browse/HIVE-21185
>             Project: Hive
>          Issue Type: Bug
>          Components: Query Planning
>    Affects Versions: 2.1.1, 2.3.0
>            Reporter: chengkun jia
>            Priority: Major
>
> Steps to reproduce:
>  
> {code:sql}
> -- init table with several small files (each insert creates its own file)
> create table multiple_small_files (id int);
> insert into multiple_small_files values (1);
> insert into multiple_small_files values (1);
> insert into multiple_small_files values (1);
> insert into multiple_small_files values (1);
> insert into multiple_small_files values (1);
> insert into multiple_small_files values (1);
> insert into multiple_small_files values (1);
> insert into multiple_small_files values (1);
> -- enable small-file merging
> set hive.merge.mapfiles=true;
> set hive.merge.mapredfiles=true;
> insert overwrite directory '/path/to/hdfs' stored as avro
> select * from multiple_small_files;
> {code}
> This produces an exception like:
> {code:java}
> Messages for this Task:
> Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable Objavro.schema{"type":"record","name":"baseRecord","fields":[{"name":"_col0","type":["null","int"],"default":null}]} [binary Avro data]
> 	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:169)
> 	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> 	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
> 	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
> 	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable Objavro.schema{"type":"record","name":"baseRecord","fields":[{"name":"_col0","type":["null","int"],"default":null}]} [binary Avro data]
> 	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:497)
> 	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:160)
> 	... 8 more
> Caused by: org.apache.hadoop.hive.serde2.avro.AvroSerdeException: Expecting a AvroGenericRecordWritable
> 	at org.apache.hadoop.hive.serde2.avro.AvroDeserializer.deserialize(AvroDeserializer.java:139)
> 	at org.apache.hadoop.hive.serde2.avro.AvroSerDe.deserialize(AvroSerDe.java:216)
> 	at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.readRow(MapOperator.java:128)
> 	at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.access$200(MapOperator.java:92)
> 	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:488)
> 	... 9 more
> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
> {code}
>  
> This issue affects not only the Avro format but all non-text storage formats. The root cause is that Hive picks up the wrong input format in the file-merge stage. A possible workaround sketch is shown below.
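> For unpatched versions, a possible workaround (a sketch only, assuming the failure comes from the extra file-merge task re-reading the Avro output, as in the reproduce steps above) is to disable small-file merging for such queries:
> {code:sql}
> -- workaround sketch: skip the file-merge stage so the Avro output
> -- is never re-read with the wrong input format
> set hive.merge.mapfiles=false;
> set hive.merge.mapredfiles=false;
> insert overwrite directory '/path/to/hdfs' stored as avro
> select * from multiple_small_files;
> {code}
> The trade-off is that the output directory keeps one file per mapper instead of a merged result.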



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)