Posted to issues@spark.apache.org by "Liang-Chi Hsieh (JIRA)" <ji...@apache.org> on 2015/07/26 13:10:04 UTC
[jira] [Commented] (SPARK-9340) ParquetTypeConverter incorrectly handles repeated types, resulting in schema mismatch
[ https://issues.apache.org/jira/browse/SPARK-9340?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14641916#comment-14641916 ]
Liang-Chi Hsieh commented on SPARK-9340:
----------------------------------------
Your test will cause org.apache.spark.sql.AnalysisException: REPEATED not supported outside LIST or MAP.
I think a repeated type is only supported inside a LIST or MAP. You can check CatalystSchemaConverter.convert.
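
For reference, a repeated primitive field is accepted when it appears under the standard three-level LIST annotation rather than as a bare repeated field. A sketch of that layout (field name taken from the report; the wrapper group names follow the Parquet LIST convention):

```
message root {
  optional group repeated_field (LIST) {
    repeated group list {
      optional int32 element;
    }
  }
}
```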
> ParquetTypeConverter incorrectly handles repeated types, resulting in schema mismatch
> --------------------------------------------------------------------------------------
>
> Key: SPARK-9340
> URL: https://issues.apache.org/jira/browse/SPARK-9340
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.2.0, 1.4.0
> Reporter: Damian Guy
> Attachments: ParquetTypesConverterTest.scala
>
>
> The way ParquetTypesConverter handles primitive repeated types results in an incompatible schema being used for querying data. For example, given a schema like so:
> message root {
>   repeated int32 repeated_field;
> }
> Spark produces a read schema like:
> message root {
>   optional int32 repeated_field;
> }
> These schemas are incompatible, and all attempts to read the data fail.
> In ParquetTypesConverter.toDataType:
> if (parquetType.isPrimitive) {
>   toPrimitiveDataType(parquetType.asPrimitiveType, isBinaryAsString, isInt96AsTimestamp)
> } else {...}
> The if condition should also check !parquetType.isRepetition(Repetition.REPEATED),
>
> and the repeated case will then need to be handled in the else branch.
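
A minimal sketch of the change the reporter describes, assuming Spark's Catalyst ArrayType and Parquet's Repetition enum (this mirrors the snippet quoted above and is not the actual patch):

```scala
// Sketch only: route bare repeated primitives through ArrayType instead of
// dropping the repetition, per the suggestion in the issue description.
if (parquetType.isPrimitive && !parquetType.isRepetition(Repetition.REPEATED)) {
  toPrimitiveDataType(parquetType.asPrimitiveType, isBinaryAsString, isInt96AsTimestamp)
} else if (parquetType.isPrimitive) {
  // A bare repeated primitive behaves like a list of its element type.
  ArrayType(
    toPrimitiveDataType(parquetType.asPrimitiveType, isBinaryAsString, isInt96AsTimestamp),
    containsNull = false)
} else {
  // ... existing handling for group types ...
}
```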
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org