Posted to issues@spark.apache.org by "Takeshi Yamamuro (Jira)" <ji...@apache.org> on 2021/03/16 14:16:00 UTC
[jira] [Updated] (SPARK-34694) Improve Spark SQL Source Filter to
allow pushdown of filters spanning multiple columns
[ https://issues.apache.org/jira/browse/SPARK-34694?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Takeshi Yamamuro updated SPARK-34694:
-------------------------------------
Component/s: (was: Spark Core)
SQL
> Improve Spark SQL Source Filter to allow pushdown of filters spanning multiple columns
> --------------------------------------------------------------------------------------
>
> Key: SPARK-34694
> URL: https://issues.apache.org/jira/browse/SPARK-34694
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.0.0, 3.0.1, 3.0.2, 3.1.0, 3.1.1
> Reporter: Chen Zou
> Priority: Minor
>
> The current org.apache.spark.sql.sources.Filter abstract class only allows pushdown of filters on a single column, or sums of products (ORs of ANDs) of such single-column filters.
> Filters that compare multiple columns cannot be pushed down through this Filter abstract class to the source, e.g. these predicates from the TPC-H benchmark on the lineitem table:
> (l_commitdate#11 < l_receiptdate#12)
> (l_shipdate#10 < l_commitdate#11)
>
> The current design probably stems from the assumption that columnar sources have a hard time supporting these cross-column filters. But with batching implemented, columnar sources can still evaluate cross-column filters row by row within a batch. This issue tries to open up discussion on a more general Filter interface that allows pushing down cross-column filters.
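For illustration, a cross-column comparison could be modeled as a new Filter subclass alongside the existing single-column ones. The sketch below is hypothetical: no LessThanColumn class exists in org.apache.spark.sql.sources, and the minimal Filter trait here merely stands in for Spark's abstract class so the example is self-contained.

```scala
// Minimal stand-in for org.apache.spark.sql.sources.Filter, whose contract
// includes the set of column names the filter references.
trait Filter {
  def references: Array[String]
}

// Existing-style single-column filter: column op literal.
case class LessThan(attribute: String, value: Any) extends Filter {
  override def references: Array[String] = Array(attribute)
}

// Hypothetical cross-column filter of the kind discussed in this issue:
// column op column. A source that materializes both columns in a batch
// can evaluate it per row without help from Spark's executor.
case class LessThanColumn(left: String, right: String) extends Filter {
  override def references: Array[String] = Array(left, right)
}

object CrossColumnFilterDemo {
  def main(args: Array[String]): Unit = {
    // The TPC-H predicate (l_commitdate < l_receiptdate) expressed as a
    // pushable filter object instead of a post-scan expression:
    val f = LessThanColumn("l_commitdate", "l_receiptdate")
    println(f.references.mkString(","))
  }
}
```

A source receiving such a filter could fall back to evaluating it batch-by-batch, which is the scenario the description above argues makes cross-column pushdown feasible for columnar formats.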
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org