Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2021/03/15 04:24:00 UTC
[jira] [Commented] (SPARK-34694) Improve Spark SQL Source Filter to
allow pushdown of filters spanning multiple columns
[ https://issues.apache.org/jira/browse/SPARK-34694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17301375#comment-17301375 ]
Hyukjin Kwon commented on SPARK-34694:
--------------------------------------
{code}
(l_commitdate#11 < l_receiptdate#12)
(l_shipdate#10 < l_commitdate#11)
{code}
will be
{{And(LessThan(l_commitdate, l_receiptdate), LessThan(l_shipdate, l_commitdate))}}. I think this seems fine. Do you mind elaborating on what kind of design you have in mind?
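The combined filter tree above can be sketched as follows. Note this is a simplified, hypothetical model for illustration only: the class names echo org.apache.spark.sql.sources, but the real LessThan there pairs an attribute with a literal value, whereas this sketch assumes a variant that accepts a column reference on both sides, which is exactly what the issue asks for.

```java
// Hypothetical model of a cross-column filter tree; NOT the real Spark API.
// ColumnRef stands in for a resolved column such as l_commitdate#11.
record ColumnRef(String name) {}

interface Filter {
    String describe();
}

// Assumed cross-column comparison: column < column (the real
// org.apache.spark.sql.sources.LessThan compares a column to a literal).
record LessThan(ColumnRef left, ColumnRef right) implements Filter {
    public String describe() {
        return "(" + left.name() + " < " + right.name() + ")";
    }
}

// Conjunction of two filters, mirroring sources.And.
record And(Filter left, Filter right) implements Filter {
    public String describe() {
        return "And(" + left.describe() + ", " + right.describe() + ")";
    }
}

public class FilterTreeSketch {
    public static void main(String[] args) {
        // The two TPC-H lineitem predicates combined into one pushed-down tree.
        Filter pushed = new And(
            new LessThan(new ColumnRef("l_commitdate"), new ColumnRef("l_receiptdate")),
            new LessThan(new ColumnRef("l_shipdate"), new ColumnRef("l_commitdate")));
        System.out.println(pushed.describe());
        // → And((l_commitdate < l_receiptdate), (l_shipdate < l_commitdate))
    }
}
```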
> Improve Spark SQL Source Filter to allow pushdown of filters spanning multiple columns
> ---------------------------------------------------------------------------------------
>
> Key: SPARK-34694
> URL: https://issues.apache.org/jira/browse/SPARK-34694
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 3.0.0, 3.0.1, 3.0.2, 3.1.0, 3.1.1
> Reporter: Chen Zou
> Priority: Minor
>
> The current org.apache.spark.sql.sources.Filter abstract class only allows pushdown of filters on a single column, or sums of products of such single-column filters.
> Filters that span multiple columns cannot be pushed down to the source through the existing Filter subclasses, e.g. these predicates from the TPC-H benchmark on the lineitem table:
> (l_commitdate#11 < l_receiptdate#12)
> (l_shipdate#10 < l_commitdate#11)
>
> The current design probably originates from the fact that columnar sources have a hard time supporting these cross-column filters. But with batching implemented in columnar sources, they can still support cross-column filters. This issue tries to open up discussion on a more general Filter interface that allows pushing down cross-column filters.
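For context on the limitation described above, the existing single-column filter shape pairs one attribute with a literal value, which is why a column-to-column comparison cannot be expressed. A simplified sketch (illustrative classes, not the real Spark API, though the shape mirrors the spirit of sources.LessThan(attribute, value)):

```java
// Simplified sketch of the current single-column filter shape: one attribute
// compared against a literal value. Hypothetical class; NOT the real
// org.apache.spark.sql.sources.LessThan.
record SingleColumnLessThan(String attribute, Object value) {
    String describe() {
        return "(" + attribute + " < " + value + ")";
    }
}

public class CurrentApiLimit {
    public static void main(String[] args) {
        // Expressible today: column compared to a literal.
        SingleColumnLessThan f = new SingleColumnLessThan("l_shipdate", "1998-09-02");
        System.out.println(f.describe());
        // → (l_shipdate < 1998-09-02)
        // Not expressible: l_shipdate < l_commitdate — the right-hand side is a
        // value, not a column reference, so cross-column predicates stay on the
        // Spark side instead of being pushed down to the source.
    }
}
```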
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org