Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/02/10 07:38:54 UTC

[GitHub] [spark] Yaohua628 commented on a change in pull request #35459: [SPARK-38159][SQL] Minor refactor of MetadataAttribute unapply method

Yaohua628 commented on a change in pull request #35459:
URL: https://github.com/apache/spark/pull/35459#discussion_r803377910



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/DataSourceScanExec.scala
##########
@@ -366,9 +364,8 @@ case class FileSourceScanExec(
   @transient
   private lazy val pushedDownFilters = {
     val supportNestedPredicatePushdown = DataSourceUtils.supportNestedPredicatePushdown(relation)
-    // TODO: should be able to push filters containing metadata columns down to skip files
     dataFilters.filterNot(_.references.exists {

Review comment:
       agreed, it would be better to remove this in `FileSourceStrategy`, but we actually need the intact `dataFilters` here in `FileSourceScanExec` to decide which files to read (i.e. `listFiles(..., dataFilters)`).
   
   I added some comments here; do you think it is OK for now?
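
   To illustrate the distinction being discussed: filters that reference metadata columns stay in `dataFilters` (used for file listing/pruning), but are excluded from the filters pushed down to the file format reader, which cannot evaluate them. The sketch below is a simplified, self-contained model of that idea; the `Attribute`/`Filter` case classes and `isMetadata` flag are illustrative stand-ins, not Spark's actual API.

```scala
// Hypothetical, simplified model of the logic in FileSourceScanExec.
// Names here are illustrative; Spark's real code matches against
// MetadataAttribute over Catalyst expressions instead.
case class Attribute(name: String, isMetadata: Boolean)
case class Filter(references: Set[Attribute])

// Keep the full dataFilters for file listing, but drop any filter
// touching a metadata column before pushing filters to the reader.
def pushedDownFilters(dataFilters: Seq[Filter]): Seq[Filter] =
  dataFilters.filterNot(_.references.exists(_.isMetadata))

val id       = Attribute("id", isMetadata = false)
val fileName = Attribute("_metadata.file_name", isMetadata = true)

val dataFilters = Seq(Filter(Set(id)), Filter(Set(id, fileName)))

// Only the filter on `id` survives push-down; the metadata filter
// remains available in dataFilters for listFiles(..., dataFilters).
println(pushedDownFilters(dataFilters).size) // prints 1
```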




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org