Posted to issues@spark.apache.org by "Yuming Wang (Jira)" <ji...@apache.org> on 2020/05/25 08:03:00 UTC

[jira] [Created] (SPARK-31811) Pushdown IsNotNull to file scan if possible

Yuming Wang created SPARK-31811:
-----------------------------------

             Summary: Pushdown IsNotNull to file scan if possible
                 Key: SPARK-31811
                 URL: https://issues.apache.org/jira/browse/SPARK-31811
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 3.1.0
            Reporter: Yuming Wang
            Assignee: Yuming Wang


We should push down {{IsNotNull}} to the file scan if possible. For example, the {{coalesce}} filter below is not pushed down ({{PushedFilters}} is empty):
{code:sql}
CREATE TABLE t1(c1 string, c2 string) USING parquet;
EXPLAIN SELECT t1.* FROM t1 WHERE coalesce(t1.c1, t1.c2) IS NOT NULL;
{code}

{noformat}
== Physical Plan ==
*(1) Filter isnotnull(coalesce(c1#43, c2#44))
+- *(1) ColumnarToRow
   +- FileScan parquet default.t1[c1#43,c2#44] Batched: true, DataFilters: [isnotnull(coalesce(c1#43, c2#44))], Format: Parquet, Location: InMemoryFileIndex[file:/root/spark-3.0.0-bin-hadoop2.7/spark-warehouse/t1], PartitionFilters: [], PushedFilters: [], ReadSchema: struct<c1:string,c2:string>
{noformat}

{code:sql}
EXPLAIN SELECT t1.* FROM t1 WHERE t1.c1 IS NOT NULL OR t1.c2 IS NOT NULL;
{code}
{noformat}
== Physical Plan ==
*(1) Filter (isnotnull(c1#43) OR isnotnull(c2#44))
+- *(1) ColumnarToRow
   +- FileScan parquet default.t1[c1#43,c2#44] Batched: true, DataFilters: [(isnotnull(c1#43) OR isnotnull(c2#44))], Format: Parquet, Location: InMemoryFileIndex[file:/root/spark-3.0.0-bin-hadoop2.7/spark-warehouse/t1], PartitionFilters: [], PushedFilters: [Or(IsNotNull(c1),IsNotNull(c2))], ReadSchema: struct<c1:string,c2:string>
{noformat}
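
Note that in the first plan {{PushedFilters}} is empty, while the logically equivalent {{OR}} form is pushed down as {{Or(IsNotNull(c1),IsNotNull(c2))}}. One possible approach is an optimizer rewrite that turns {{IsNotNull(Coalesce(...))}} into a disjunction of per-child {{IsNotNull}} predicates, which the data source can then push down. The following is a minimal, self-contained sketch of that rewrite over a toy expression tree; it is *not* actual Catalyst code, and the class names ({{Attr}}, {{Coalesce}}, {{IsNotNull}}, {{Or}}, {{RewriteIsNotNull}}) are hypothetical stand-ins for the corresponding Catalyst expressions:

```scala
// Toy expression tree standing in for Catalyst expressions (hypothetical, not Spark API).
sealed trait Expr
case class Attr(name: String) extends Expr
case class Coalesce(children: Seq[Expr]) extends Expr
case class IsNotNull(child: Expr) extends Expr
case class Or(left: Expr, right: Expr) extends Expr

object RewriteIsNotNull {
  // coalesce(c1, ..., cn) IS NOT NULL holds iff at least one child is not null,
  // so it can be rewritten to IsNotNull(c1) OR ... OR IsNotNull(cn).
  def apply(e: Expr): Expr = e match {
    case IsNotNull(Coalesce(children)) if children.nonEmpty =>
      children.map(c => IsNotNull(c): Expr).reduceLeft(Or(_, _))
    case other => other // leave everything else untouched
  }
}
```

In a real implementation this would be applied as part of filter pushdown, so that the rewritten disjunction reaches the source as a pushable {{Or(IsNotNull(c1),IsNotNull(c2))}} filter.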

Real performance test case:

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
