Posted to issues@spark.apache.org by "Kousuke Saruta (Jira)" <ji...@apache.org> on 2021/11/08 15:25:00 UTC

[jira] [Resolved] (SPARK-37240) Cannot read partitioned parquet files with ANSI interval partition values

     [ https://issues.apache.org/jira/browse/SPARK-37240?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kousuke Saruta resolved SPARK-37240.
------------------------------------
    Fix Version/s: 3.3.0
       Resolution: Fixed

Issue resolved in https://github.com/apache/spark/pull/34517

> Cannot read partitioned parquet files with ANSI interval partition values
> -------------------------------------------------------------------------
>
>                 Key: SPARK-37240
>                 URL: https://issues.apache.org/jira/browse/SPARK-37240
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Max Gekk
>            Assignee: Max Gekk
>            Priority: Major
>             Fix For: 3.3.0
>
>
> The code below demonstrates the issue:
> {code:scala}
> scala> sql("SELECT INTERVAL '1' YEAR AS i, 0 as id").write.partitionBy("i").parquet("/Users/maximgekk/tmp/ansi_interval_parquet")
> scala> spark.read.schema("i INTERVAL YEAR, id INT").parquet("/Users/maximgekk/tmp/ansi_interval_parquet").show(false)
> 21/11/08 10:56:36 ERROR Executor: Exception in task 0.0 in stage 2.0 (TID 2)
> java.lang.RuntimeException: DataType INTERVAL YEAR is not supported in column vectorized reader.
> 	at org.apache.spark.sql.execution.vectorized.ColumnVectorUtils.populate(ColumnVectorUtils.java:100)
> 	at org.apache.spark.sql.execution.datasources.parquet.VectorizedParquetRecordReader.initBatch(VectorizedParquetRecordReader.java:243)
> {code}
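> A possible interim workaround, until the fix lands, is to disable the vectorized Parquet reader so the read takes the row-based path; the stack trace points at VectorizedParquetRecordReader, so this is a sketch under the assumption that the failure is confined to the vectorized path. Run in spark-shell, where {{spark}} is already in scope:
> {code:scala}
> // Assumption: the row-based Parquet reader handles ANSI interval partition
> // values; only the vectorized reader's ColumnVectorUtils.populate rejects them.
> // spark.sql.parquet.enableVectorizedReader is an existing Spark SQL config.
> spark.conf.set("spark.sql.parquet.enableVectorizedReader", "false")
>
> spark.read
>   .schema("i INTERVAL YEAR, id INT")
>   .parquet("/Users/maximgekk/tmp/ansi_interval_parquet")
>   .show(false)
> {code}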


