Posted to issues@spark.apache.org by "Chen Zou (Jira)" <ji...@apache.org> on 2021/03/15 15:23:00 UTC

[jira] [Comment Edited] (SPARK-34694) Improve Spark SQL Source Filter to allow pushdown of filters span multiple columns

    [ https://issues.apache.org/jira/browse/SPARK-34694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17301701#comment-17301701 ] 

Chen Zou edited comment on SPARK-34694 at 3/15/21, 3:22 PM:
------------------------------------------------------------

Hi Hyukjin,

I think the design you described would work.

But the current org.apache.spark.sql.sources.Filter isn't built under the assumption that the 'value' parameter could be a column reference.

e.g., the findReferences member function does not consider the case where value is a column reference.
{quote}protected def findReferences(value: Any): Array[String] = value match {
  case f: Filter => f.references
  case _ => Array.empty
}
{quote}
 

And this is probably why org.apache.spark.sql.execution.datasources.v2.PushDownUtils does not push cross-column filters down to data sources.
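For illustration, one possible direction is a marker type that lets the value side of a comparison reference another column, with findReferences extended to account for it. The following is only a self-contained sketch: ColumnRef, Literal, and the simplified Filter/LessThan below are hypothetical stand-ins, not the actual org.apache.spark.sql.sources classes.

```scala
// Hypothetical marker: lets a filter's value side name another column.
sealed trait Expr
case class ColumnRef(name: String) extends Expr
case class Literal(value: Any) extends Expr

// Simplified stand-in for org.apache.spark.sql.sources.Filter.
abstract class Filter {
  def references: Array[String]
  // Extended findReferences: a ColumnRef now contributes a reference,
  // so cross-column filters report every column they touch.
  protected def findReferences(value: Any): Array[String] = value match {
    case f: Filter    => f.references
    case ColumnRef(n) => Array(n)
    case _            => Array.empty
  }
}

case class LessThan(attribute: String, value: Any) extends Filter {
  override def references: Array[String] =
    Array(attribute) ++ findReferences(value)
}

// A cross-column predicate such as l_commitdate < l_receiptdate would then
// expose both columns, which is what a pushdown planner needs to see.
val f = LessThan("l_commitdate", ColumnRef("l_receiptdate"))
```

With references reporting both columns, a planner along the lines of PushDownUtils could at least recognize cross-column filters instead of silently keeping them as post-scan filters.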

The end result is that cross-column filters don't get pushed down, as shown in the stderr of a Spark job running TPC-H Q12:

21/03/10 16:56:16.266 INFO V2ScanRelationPushDown: 
 Pushing operators to lineitem@[file:///home/colouser51/udpstorage/tpch/tbl_s1e1/lineitem]
 Pushed Filters: Or(EqualTo(l_shipmode,MAIL),EqualTo(l_shipmode,SHIP)), GreaterThanOrEqual(l_receiptdate,1994-01-01), LessThan(l_receiptdate,1995-01-01)
 Post-Scan Filters: (l_commitdate#11 < l_receiptdate#12),(l_shipdate#10 < l_commitdate#11)
 Output: l_orderkey#0, l_shipdate#10, l_commitdate#11, l_receiptdate#12, l_shipmode#14
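To illustrate why a batched source could still handle such filters (sketch only: LineItemBatch and crossColumnFilter are made-up names, and a real source would operate on columnar vectors rather than plain arrays), once a batch is materialized the source can evaluate a cross-column predicate like l_commitdate < l_receiptdate locally:

```scala
// Two date columns from a materialized batch, encoded as yyyymmdd ints.
case class LineItemBatch(commitDate: Array[Int], receiptDate: Array[Int])

// Keep the indices of rows where l_commitdate < l_receiptdate.
def crossColumnFilter(b: LineItemBatch): Array[Int] =
  (0 until b.commitDate.length)
    .filter(i => b.commitDate(i) < b.receiptDate(i))
    .toArray

val batch = LineItemBatch(
  Array(19940110, 19940301),  // l_commitdate
  Array(19940215, 19940220))  // l_receiptdate
// Only row 0 satisfies l_commitdate < l_receiptdate.
```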

 

Regards,
 Chen



> Improve Spark SQL Source Filter to allow pushdown of filters span multiple columns
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-34694
>                 URL: https://issues.apache.org/jira/browse/SPARK-34694
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.0.0, 3.0.1, 3.0.2, 3.1.0, 3.1.1
>            Reporter: Chen Zou
>            Priority: Minor
>
> The current org.apache.spark.sql.sources.Filter abstract class only allows pushdown of filters on a single column, or a sum of products of such single-column filters.
> Filters spanning multiple columns cannot be pushed down through this Filter class to the source, e.g. from the TPC-H benchmark on the lineitem table:
> (l_commitdate#11 < l_receiptdate#12)
> (l_shipdate#10 < l_commitdate#11)
>  
> The current design probably originates from the fact that columnar sources have a hard time supporting these cross-column filters. But with batching implemented in columnar sources, they can still support cross-column filters. This issue tries to open up discussion on a more general Filter interface that allows pushing down cross-column filters.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org