Posted to issues@iceberg.apache.org by GitBox <gi...@apache.org> on 2023/01/05 00:24:20 UTC

[GitHub] [iceberg] aokolnychyi commented on a diff in pull request #6524: Spark 3.3: Discard filters that can be pushed down completely

aokolnychyi commented on code in PR #6524:
URL: https://github.com/apache/iceberg/pull/6524#discussion_r1061989902


##########
spark/v3.3/spark/src/main/java/org/apache/iceberg/spark/source/SparkScanBuilder.java:
##########
@@ -106,41 +106,50 @@ public SparkScanBuilder caseSensitive(boolean isCaseSensitive) {
   @Override
   public Filter[] pushFilters(Filter[] filters) {
     List<Expression> expressions = Lists.newArrayListWithExpectedSize(filters.length);
-    List<Filter> pushed = Lists.newArrayListWithExpectedSize(filters.length);
+    List<Filter> pushableFilters = Lists.newArrayListWithExpectedSize(filters.length);
+    List<Filter> postScanFilters = Lists.newArrayListWithExpectedSize(filters.length);
 
     for (Filter filter : filters) {
-      Expression expr = null;
-      try {
-        expr = SparkFilters.convert(filter);
-      } catch (IllegalArgumentException e) {
-        // converting to Iceberg Expression failed, so this expression cannot be pushed down
-        LOG.info(
-            "Failed to convert filter to Iceberg expression, skipping push down for this expression: {}. {}",
-            filter,
-            e.getMessage());
-      }
+      Expression expr = safelyConvertFilter(filter);
 
       if (expr != null) {
-        try {
-          Binder.bind(schema.asStruct(), expr, caseSensitive);
-          expressions.add(expr);
-          pushed.add(filter);
-        } catch (ValidationException e) {
-          // binding to the table schema failed, so this expression cannot be pushed down
-          LOG.info(
-              "Failed to bind expression to table schema, skipping push down for this expression: {}. {}",
-              filter,
-              e.getMessage());
-        }
+        expressions.add(expr);
+        pushableFilters.add(filter);
+      }
+
+      if (expr == null || requiresRecordLevelFiltering(expr)) {
+        postScanFilters.add(filter);
       }
     }
 
     this.filterExpressions = expressions;
-    this.pushedFilters = pushed.toArray(new Filter[0]);
+    this.pushedFilters = pushableFilters.toArray(new Filter[0]);
+
+    // all unsupported filters and filters that require record-level filtering
+    // must be reported back and handled on the Spark side
+    return postScanFilters.toArray(new Filter[0]);
+  }
+
+  private Expression safelyConvertFilter(Filter filter) {
+    try {
+      Expression expr = SparkFilters.convert(filter);
+
+      if (expr != null) {
+        // try binding the expression to ensure it can be pushed down
+        Binder.bind(schema.asStruct(), expr, caseSensitive);
+        return expr;
+      }
+
+    } catch (Exception e) {

Review Comment:
   I decided to go with a single try-catch to simplify the implementation, instead of having separate blocks for conversion and binding. 
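To make the control flow above concrete, here is a minimal, hedged sketch of the partitioning that the new `pushFilters` performs. The `Filter` and `Expression` records below are stand-ins invented for illustration (the real code uses Spark's `Filter` and Iceberg's `Expression`, with `SparkFilters.convert` and `Binder.bind` inside the single try-catch); only the shape of the logic mirrors the diff.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in types; the real implementation operates on
// org.apache.spark.sql.sources.Filter and org.apache.iceberg.expressions.Expression.
public class FilterPushdownSketch {
  record Filter(String name, boolean convertible, boolean recordLevel) {}
  record Expression(String repr, boolean recordLevel) {}

  // Single try-catch covering both conversion and binding, mirroring the
  // simplification described in the review comment: any failure means the
  // filter cannot be pushed down and will be handled by Spark post-scan.
  static Expression safelyConvert(Filter filter) {
    try {
      if (!filter.convertible()) {
        throw new IllegalArgumentException("cannot convert " + filter.name());
      }
      // in the real code, Binder.bind(...) would run here and may also throw
      return new Expression(filter.name(), filter.recordLevel());
    } catch (Exception e) {
      return null;
    }
  }

  // Partitions filters: `pushed` receives everything Iceberg can use for
  // pruning; the returned list is what Spark must still evaluate after the
  // scan (unconvertible filters plus those needing record-level filtering).
  static List<Filter> pushFilters(Filter[] filters, List<Filter> pushed) {
    List<Filter> postScanFilters = new ArrayList<>();
    for (Filter filter : filters) {
      Expression expr = safelyConvert(filter);
      if (expr != null) {
        pushed.add(filter);
      }
      if (expr == null || expr.recordLevel()) {
        postScanFilters.add(filter);
      }
    }
    return postScanFilters;
  }
}
```

Note that a filter can land in both lists: a pushable filter that still requires record-level evaluation is used by Iceberg for file pruning *and* reported back to Spark, whereas a filter that can be applied completely by metadata pruning is discarded from the post-scan set, which is the point of this PR.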



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@iceberg.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

