Posted to issues@iceberg.apache.org by GitBox <gi...@apache.org> on 2021/04/26 00:51:44 UTC

[GitHub] [iceberg] wypoon commented on a change in pull request #2460: Spark: Upgrade spark.version to 3.1.1

wypoon commented on a change in pull request #2460:
URL: https://github.com/apache/iceberg/pull/2460#discussion_r619911019



##########
File path: spark3/src/main/java/org/apache/iceberg/spark/source/SparkTable.java
##########
@@ -205,6 +205,9 @@ public boolean canDeleteWhere(Filter[] filters) {
     }
 
     Set<Integer> identitySourceIds = table().spec().identitySourceIds();
+    if (identitySourceIds.isEmpty()) {
+      return true;
+    }

Review comment:
       I am not convinced that this change to `canDeleteWhere` is correct.
   @aokolnychyi can you please comment? I know you added `canDeleteWhere` to `SupportsDelete` in Spark (it went into 3.1), and if I understand correctly, some change on the Spark side to rewrite the query when `canDeleteWhere` returns false is still needed but was left to be implemented later?
   This change causes TestDeleteFrom to pass (it otherwise fails with Spark 3.1), but in my local testing, with this change TestCopyOnWriteDelete hangs (commits keep failing and retrying indefinitely).
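   To make the concern concrete, here is a hypothetical standalone sketch of the logic in question. It does not use the real Iceberg or Spark APIs; `canDeleteWhere`, the set of identity-partition source ids, and the per-filter source ids are all reduced to plain Java types for illustration. The point is that the added early return makes an unpartitioned table (empty `identitySourceIds`) report that any filter is deletable at the metadata level, even when the filter references ordinary data columns:

```java
import java.util.Collections;
import java.util.Set;

public class CanDeleteWhereSketch {
  // Hypothetical simplification: returns true only if every filter can be
  // evaluated purely against identity-partition source ids, i.e. the delete
  // can be satisfied by dropping whole partitions without rewriting files.
  static boolean canDeleteWhere(Set<Integer> identitySourceIds, int[] filterSourceIds) {
    // The PR adds this early return. With no identity partitions it claims
    // every filter is metadata-deletable, which is what the reviewer doubts:
    // a filter on a data column may still require rewriting data files.
    if (identitySourceIds.isEmpty()) {
      return true;
    }
    for (int id : filterSourceIds) {
      if (!identitySourceIds.contains(id)) {
        return false;
      }
    }
    return true;
  }

  public static void main(String[] args) {
    // Unpartitioned table: the early return answers true for any filter.
    System.out.println(canDeleteWhere(Collections.emptySet(), new int[] {7}));
    // Identity partition on field 1, filter on field 7: rejected.
    System.out.println(canDeleteWhere(Set.of(1), new int[] {7}));
  }
}
```

   Under this reading, the hang in TestCopyOnWriteDelete is plausible: if Spark trusts a true answer and attempts a metadata-only delete that cannot actually succeed, the commit can fail and retry indefinitely.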




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@iceberg.apache.org
For additional commands, e-mail: issues-help@iceberg.apache.org