Posted to dev@parquet.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2017/08/01 08:53:00 UTC

[jira] [Commented] (PARQUET-1073) Hive failed to parse Parquet file generated by Spark SQL

    [ https://issues.apache.org/jira/browse/PARQUET-1073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16108588#comment-16108588 ] 

Hyukjin Kwon commented on PARQUET-1073:
---------------------------------------

Hi [~junjie], I just happened to look at this one while browsing Parquet JIRAs out of curiosity.
My guess is that this is not a Parquet-specific problem. Could you try writing the data out with {{spark.sql.parquet.writeLegacyFormat}} enabled in Spark, and then reading it back?
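For reference, the suggested workaround amounts to rewriting the data with Spark's legacy Parquet layout enabled before Hive reads it (with this setting, Spark writes decimals as fixed-length binary, which older Hive Parquet readers expect, instead of INT32/INT64). A minimal sketch for a Spark SQL session; the target table name `sparksql_tbl_legacy` is illustrative, not from this issue:

```sql
-- Enable Spark's legacy Parquet representation (default is false in Spark 2.x).
SET spark.sql.parquet.writeLegacyFormat=true;

-- Rewrite the table so the new Parquet files use the legacy encoding.
-- sparksql_tbl_legacy is a hypothetical name for the rewritten table.
CREATE TABLE sparksql_tbl_legacy
  USING PARQUET
  AS SELECT * FROM sparksql_tbl;
```

After the rewrite, pointing the Hive external table at the new files should avoid the {{decodeToBinary}} failure, assuming the decimal encoding mismatch is indeed the cause here.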

> Hive failed to parse Parquet file generated by Spark SQL
> --------------------------------------------------------
>
>                 Key: PARQUET-1073
>                 URL: https://issues.apache.org/jira/browse/PARQUET-1073
>             Project: Parquet
>          Issue Type: Bug
>          Components: parquet-mr
>    Affects Versions: 1.8.1
>         Environment: HIVE-2.3.0-SNAPSHOT
> SPARK-2.1.0
> PARQUET-MR 1.8.1
>            Reporter: Junjie Chen
>            Priority: Minor
>
> When loading a Parquet file generated by Spark SQL using the following SQL:
> CREATE EXTERNAL TABLE IF NOT EXISTS sparksql_tbl
> (
> ...
> )
>   STORED AS PARQUET
>   LOCATION '/root/spark-warehouse/sparksql_db.db/sparksql_tbl/'
> parquet-mr throws the following exception:
> Diagnostic Messages for this Task:
> Error: java.io.IOException: java.lang.reflect.InvocationTargetException
>         at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
>         at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
>         at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:271)
>         at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(HadoopShimsSecure.java:217)
>         at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getRecordReader(HadoopShimsSecure.java:345)
>         at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:695)
>         at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169)
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> Caused by: java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:257)
>         ... 11 more
> Caused by: org.apache.parquet.io.ParquetDecodingException: Can not read value at 0 in block -1 in file hdfs://bdpe30:9001/root/SQLDataGen/spark-warehouse/sparksql_db.db/sparksql_tbl/part-r-00001-d9a4d43a-e134-4a04-97d4-268dabe26078.parquet
>         at org.apache.parquet.hadoop.InternalParquetRecordReader.nextKeyValue(InternalParquetRecordReader.java:222)
>         at org.apache.parquet.hadoop.ParquetRecordReader.nextKeyValue(ParquetRecordReader.java:217)
>         at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:98)
>         at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:59)
>         at org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.getRecordReader(MapredParquetInputFormat.java:75)
>         at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:99)
>         ... 16 more
> Caused by: java.lang.UnsupportedOperationException: org.apache.parquet.column.values.dictionary.PlainValuesDictionary$PlainIntegerDictionary
>         at org.apache.parquet.column.Dictionary.decodeToBinary(Dictionary.java:44)
>         at org.apache.hadoop.hive.ql.io.parquet.convert.ETypeConverter$BinaryConverter.setDictionary(ETypeConverter.java:291)
>         at org.apache.parquet.column.impl.ColumnReaderImpl.<init>(ColumnReaderImpl.java:346)
>         at org.apache.parquet.column.impl.ColumnReadStoreImpl.newMemColumnReader(ColumnReadStoreImpl.java:82)
>         at org.apache.parquet.column.impl.ColumnReadStoreImpl.getColumnReader(ColumnReadStoreImpl.java:77)
>         at org.apache.parquet.io.RecordReaderImplementation.<init>(RecordReaderImplementation.java:270)
>         at org.apache.parquet.io.MessageColumnIO$1.visit(MessageColumnIO.java:144)
>         at org.apache.parquet.io.MessageColumnIO$1.visit(MessageColumnIO.java:106)
>         at org.apache.parquet.filter2.compat.FilterCompat$NoOpFilter.accept(FilterCompat.java:154)
>         at org.apache.parquet.io.MessageColumnIO.getRecordReader(MessageColumnIO.java:106)
>         at org.apache.parquet.hadoop.InternalParquetRecordReader.checkRead(InternalParquetRecordReader.java:136)
>         at org.apache.parquet.hadoop.InternalParquetRecordReader.nextKeyValue(InternalParquetRecordReader.java:193)
>         ... 21 more
> Container killed by the ApplicationMaster.
> Container killed on request. Exit code is 143
> Container exited with a non-zero exit code 143
> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)