Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2021/09/17 04:55:00 UTC
[jira] [Commented] (SPARK-36776) Partition filter of DataSourceV2ScanRelation cannot be pushed down when no dataSchema is selected from FileScan
[ https://issues.apache.org/jira/browse/SPARK-36776?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17416451#comment-17416451 ]
Hyukjin Kwon commented on SPARK-36776:
--------------------------------------
I think this was fixed by SPARK-36351. cc [~huaxingao] FYI
> Partition filter of DataSourceV2ScanRelation cannot be pushed down when no dataSchema is selected from FileScan
> --------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-36776
> URL: https://issues.apache.org/jira/browse/SPARK-36776
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.1.2
> Reporter: suheng.cloud
> Priority: Major
>
> In the PruneFileSourcePartitions rule, FileScan::withFilters is called to push down the partition pruning filter (and this is the only place the function is called), but the call is guarded by the condition "scan.readDataSchema.nonEmpty".
> [source code here|https://github.com/apache/spark/blob/de351e30a90dd988b133b3d00fa6218bfcaba8b8/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/PruneFileSourcePartitions.scala#L114]
> We use Spark SQL with a custom catalog and run count queries such as: select count(*) from catalog.db.tbl where dt='0812' (the same happens for other SQL statements that do not select any column of tbl), where dt is a partition key.
> In this case scan.readDataSchema is indeed empty, so no partition pruning is performed on the scan, which ultimately causes all partitions to be scanned.
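A minimal repro sketch of the scenario above, assuming a local SparkSession and a throwaway Parquet table (the /tmp path, table name, and column names are made up for illustration). The original report uses a custom catalog; as an approximation, setting spark.sql.sources.useV1SourceList to empty forces the built-in file sources onto the DataSourceV2 FileScan path discussed here:

    import org.apache.spark.sql.SparkSession

    object Spark36776Repro {
      def main(args: Array[String]): Unit = {
        // Force the DataSourceV2 file-source path so the scan is planned as a
        // DataSourceV2ScanRelation over a FileScan (parquet defaults to V1).
        val spark = SparkSession.builder()
          .appName("SPARK-36776 repro")
          .master("local[*]")
          .config("spark.sql.sources.useV1SourceList", "")
          .getOrCreate()
        import spark.implicits._

        // Write a tiny table partitioned by dt (hypothetical path).
        Seq(("a", "0812"), ("b", "0813"))
          .toDF("value", "dt")
          .write
          .partitionBy("dt")
          .mode("overwrite")
          .parquet("/tmp/spark36776/tbl")

        spark.read.parquet("/tmp/spark36776/tbl").createOrReplaceTempView("tbl")

        // count(*) reads no data columns, so scan.readDataSchema is empty; on
        // affected versions the dt = '0812' partition filter is then never
        // pushed into the FileScan and every partition is scanned.
        spark.sql("SELECT count(*) FROM tbl WHERE dt = '0812'").explain()

        spark.stop()
      }
    }

On an affected version, the explain output shows the scan without the partition filter pushed down, matching the behavior described in the report.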
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org