Posted to issues@spark.apache.org by "Max Gekk (Jira)" <ji...@apache.org> on 2021/09/29 06:58:00 UTC
[jira] [Created] (SPARK-36889) Respect `spark.sql.parquet.filterPushdown` by explain() for DSv2
Max Gekk created SPARK-36889:
--------------------------------
Summary: Respect `spark.sql.parquet.filterPushdown` by explain() for DSv2
Key: SPARK-36889
URL: https://issues.apache.org/jira/browse/SPARK-36889
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 3.3.0
Reporter: Max Gekk
Assignee: Max Gekk
When filter pushdown for Parquet is disabled via the SQL config spark.sql.parquet.filterPushdown, explain() still reports pushed-down filters:
{code}
== Parsed Logical Plan ==
'Filter ('c0 = 1)
+- RelationV2[c0#7] parquet file:/private/var/folders/p3/dfs6mf655d7fnjrsjvldh0tc0000gn/T/spark-ff7e9a24-fd4e-4981-9c75-e1bcde78e91a
== Analyzed Logical Plan ==
c0: int
Filter (c0#7 = 1)
+- RelationV2[c0#7] parquet file:/private/var/folders/p3/dfs6mf655d7fnjrsjvldh0tc0000gn/T/spark-ff7e9a24-fd4e-4981-9c75-e1bcde78e91a
== Optimized Logical Plan ==
Filter (isnotnull(c0#7) AND (c0#7 = 1))
+- RelationV2[c0#7] parquet file:/private/var/folders/p3/dfs6mf655d7fnjrsjvldh0tc0000gn/T/spark-ff7e9a24-fd4e-4981-9c75-e1bcde78e91a
== Physical Plan ==
*(1) Filter (isnotnull(c0#7) AND (c0#7 = 1))
+- *(1) ColumnarToRow
+- BatchScan[c0#7] ParquetScan DataFilters: [isnotnull(c0#7), (c0#7 = 1)], Format: parquet, Location: InMemoryFileIndex(1 paths)[file:/private/var/folders/p3/dfs6mf655d7fnjrsjvldh0tc0000gn/T/spark-ff..., PartitionFilters: [], PushedFilters: [IsNotNull(c0), EqualTo(c0,1)], ReadSchema: struct<c0:int>, PushedFilters: [IsNotNull(c0), EqualTo(c0,1)] RuntimeFilters: []
{code}
Note PushedFilters: [IsNotNull(c0), EqualTo(c0,1)] in the physical plan, even though pushdown is disabled.
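The behavior can be reproduced with a short spark-shell session along these lines (a sketch based on the description above; the output path and column setup are illustrative, not taken from the issue):

```scala
// Reproduction sketch for SPARK-36889 (assumes a running spark-shell,
// where `spark` is the active SparkSession).

// Write a small Parquet dataset with an int column c0 (hypothetical path).
val path = "/tmp/spark-36889-parquet"
spark.range(10).selectExpr("cast(id as int) as c0")
  .write.mode("overwrite").parquet(path)

// Disable Parquet filter pushdown.
spark.conf.set("spark.sql.parquet.filterPushdown", false)

// Read back through the DSv2 Parquet source and filter.
// Per the bug report, explain() is still expected to print
// PushedFilters: [IsNotNull(c0), EqualTo(c0,1)] in the BatchScan node.
spark.read.parquet(path).filter("c0 = 1").explain(true)
```

With pushdown disabled, the scan should report an empty PushedFilters list; the filters still appearing there is what this issue tracks.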
--
This message was sent by Atlassian Jira
(v8.3.4#803005)