Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/09/28 09:31:00 UTC

[jira] [Commented] (SPARK-36866) Pushdown filters with ANSI interval values to parquet

    [ https://issues.apache.org/jira/browse/SPARK-36866?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17421277#comment-17421277 ] 

Apache Spark commented on SPARK-36866:
--------------------------------------

User 'MaxGekk' has created a pull request for this issue:
https://github.com/apache/spark/pull/34115

> Pushdown filters with ANSI interval values to parquet
> -----------------------------------------------------
>
>                 Key: SPARK-36866
>                 URL: https://issues.apache.org/jira/browse/SPARK-36866
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Max Gekk
>            Assignee: Max Gekk
>            Priority: Major
>
> Spark doesn't push down filters with ANSI intervals to the Parquet datasource:
> {code:scala}
> scala> import java.time.Period
> import java.time.Period
> scala> val df = Seq(Period.ofMonths(-1), Period.ofMonths(1)).toDF("i")
> df: org.apache.spark.sql.DataFrame = [i: interval year to month]
> scala> df.write.parquet("/Users/maximgekk/tmp/parquet_filter")
> scala> val readback = spark.read.parquet("/Users/maximgekk/tmp/parquet_filter")
> readback: org.apache.spark.sql.DataFrame = [i: interval year to month]
> scala> readback.explain(true)
> ...
> == Physical Plan ==
> *(1) ColumnarToRow
> +- FileScan parquet [i#11] Batched: true, DataFilters: [], Format: Parquet, Location: InMemoryFileIndex(1 paths)[file:/Users/maximgekk/tmp/parquet_filter], PartitionFilters: [], PushedFilters: [], ReadSchema: struct<i:interval year to month>
> {code}
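
To observe the missing pushdown end to end, a filter has to be applied to the scan. Below is a minimal sketch continuing the spark-shell session above; the threshold Period.ofMonths(0) is illustrative, and the expected plan shape is an assumption based on the issue description (the predicate should appear under DataFilters while PushedFilters stays empty until the linked PR lands).

{code:scala}
import java.time.Period

// Illustrative filter on the year-month interval column read back above.
// Spark 3.2+ accepts java.time.Period as a literal in comparisons
// against an "interval year to month" column.
val filtered = readback.filter($"i" > Period.ofMonths(0))

// In the physical plan, the predicate is expected under DataFilters,
// while PushedFilters: [] indicates it never reaches the Parquet reader.
filtered.explain(true)
{code}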



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org