Posted to issues@hive.apache.org by "Janaki Lahorani (JIRA)" <ji...@apache.org> on 2018/04/03 16:35:00 UTC

[jira] [Assigned] (HIVE-19098) Hive: impossible to insert data in a parquet's table with "union all" in the select query

     [ https://issues.apache.org/jira/browse/HIVE-19098?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Janaki Lahorani reassigned HIVE-19098:
--------------------------------------

    Assignee: Janaki Lahorani

> Hive: impossible to insert data in a parquet's table with "union all" in the select query
> -----------------------------------------------------------------------------------------
>
>                 Key: HIVE-19098
>                 URL: https://issues.apache.org/jira/browse/HIVE-19098
>             Project: Hive
>          Issue Type: Bug
>          Components: Hive
>    Affects Versions: 2.3.2
>            Reporter: ACOSS
>            Assignee: Janaki Lahorani
>            Priority: Minor
>
> Hello,
> We have a Parquet table.
> We want to insert data into the table with a query like this:
> "insert into my_table select * from my_select_table_1 union all select * from my_select_table_2"
> It fails with the error:
> 2018-04-03 15:49:28,898 FATAL [IPC Server handler 2 on 38465] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1522749003448_0028_m_000000_0 - exited : java.io.IOException: java.lang.reflect.InvocationTargetException
>  at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
>  at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
>  at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:271)
>  at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(HadoopShimsSecure.java:217)
>  at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getRecordReader(HadoopShimsSecure.java:345)
>  at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:695)
>  at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169)
>  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
>  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:175)
>  at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:422)
>  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>  at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:169)
> Caused by: java.lang.reflect.InvocationTargetException
>  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>  at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>  at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:257)
>  ... 11 more
> Caused by: java.lang.NullPointerException
>  at java.util.AbstractCollection.addAll(AbstractCollection.java:343)
>  at org.apache.hadoop.hive.ql.io.parquet.ProjectionPusher.pushProjectionsAndFilters(ProjectionPusher.java:118)
>  at org.apache.hadoop.hive.ql.io.parquet.ProjectionPusher.pushProjectionsAndFilters(ProjectionPusher.java:189)
>  at org.apache.hadoop.hive.ql.io.parquet.ParquetRecordReaderBase.getSplit(ParquetRecordReaderBase.java:75)
>  at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:75)
>  at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:60)
>  at org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.getRecordReader(MapredParquetInputFormat.java:75)
>  at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:99)
>  ... 16 more
>  
> Scenario:
> create table t1 (col1 string);
> create table t2 (col1 string);
> insert into t2 values ('2017');
> insert into t1 values ('2017');
> create table t3 (col1 string) STORED AS PARQUETFILE;
> insert into t3 select col1 from t1 union all select col1 from t2;
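
A possible workaround sketch, not part of the original report and unverified against this bug: the stack trace goes through CombineHiveInputFormat and CombineHiveRecordReader, so switching the session to the plain HiveInputFormat before rerunning the insert may avoid the failing code path. hive.input.format is a standard Hive setting; the table names reuse the t1/t2/t3 scenario above.

    -- Unverified workaround sketch: bypass CombineHiveInputFormat (seen in the
    -- stack trace) by selecting the plain HiveInputFormat for this session.
    set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;
    insert into t3 select col1 from t1 union all select col1 from t2;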



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)