Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/10/18 20:21:23 UTC

[GitHub] [spark] imback82 commented on a change in pull request #30079: [SPARK-33174][SQL] Migrate DROP TABLE to use UnresolvedTableOrView to resolve the identifier

imback82 commented on a change in pull request #30079:
URL: https://github.com/apache/spark/pull/30079#discussion_r507208340



##########
File path: sql/core/src/test/scala/org/apache/spark/sql/connector/TestV2SessionCatalogBase.scala
##########
@@ -74,8 +82,11 @@ private[connector] trait TestV2SessionCatalogBase[T <: Table] extends Delegating
   }
 
   def clearTables(): Unit = {
-    assert(!tables.isEmpty, "Tables were empty, maybe didn't use the session catalog code path?")

Review comment:
      Now that `withTable` works consistently, this log message should be updated.
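
A hedged sketch of the `withTable` cleanup shape being relied on here (illustrative names, not Spark's actual `SQLTestUtils` API): the helper drops its tables in a `finally` block, so cleanup runs even when the test body throws, and per-suite helpers like `clearTables()` no longer need to assume leftover tables.

```scala
object WithTableSketch {
  // Run the test body and always drop the given tables afterwards, even when
  // the body throws. `runSql` is a stand-in for issuing SQL against the catalog.
  def withTable(runSql: String => Unit)(tableNames: String*)(testBody: => Unit): Unit = {
    try {
      testBody
    } finally {
      tableNames.foreach(name => runSql(s"DROP TABLE IF EXISTS $name"))
    }
  }
}
```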

##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala
##########
@@ -1091,6 +1095,19 @@ class Analyzer(
     }
   }
 
+  /**
+   * Resolve [[UnresolvedTableOrView]] by replacing it with [[NotFoundTableOrView]]
+   * if the name cannot be resolved and resolution of the table or view is not required.
+   *
+   * This rule should run after [[ResolveRelations]] and [[ResolveTables]] are run.

Review comment:
      I recall that rules shouldn't depend on other rules having run first. I can put these into one rule if the current approach is not desired.
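
As a sketch only (toy classes, not Spark's actual catalyst API), the "one rule" alternative mentioned above would fold the catalog lookup and the not-found fallback into a single pass, so nothing has to be guaranteed to run after `ResolveRelations`/`ResolveTables`. Every name below is illustrative.

```scala
// Toy model of the single-rule alternative; not Spark's real analyzer types.
sealed trait Plan
case class Unresolved(name: String, resolutionRequired: Boolean = true) extends Plan
case class Resolved(name: String) extends Plan
case class NotFound(name: String) extends Plan

object ResolveOrMarkNotFound {
  // Stand-in for a catalog lookup; only "existing_tbl" resolves.
  private def lookup(name: String): Option[Plan] =
    if (name == "existing_tbl") Some(Resolved(name)) else None

  // Resolve when possible; fall back to NotFound only when the command
  // tolerates a missing relation; otherwise leave the node unresolved so
  // later analysis can report the error.
  def apply(plan: Plan): Plan = plan match {
    case u @ Unresolved(name, required) =>
      lookup(name).getOrElse(if (required) u else NotFound(name))
    case other => other
  }
}
```

With this shape the rule is order-insensitive within a fixed-point batch: `ResolveOrMarkNotFound(Unresolved("missing_tbl", resolutionRequired = false))` yields `NotFound("missing_tbl")`, while a required-but-missing name is left untouched for the usual analysis error.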

##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/v2ResolutionPlans.scala
##########
@@ -45,12 +45,25 @@ case class UnresolvedTable(multipartIdentifier: Seq[String]) extends LeafNode {
 /**
  * Holds the name of a table or view that has yet to be looked up in a catalog. It will
  * be resolved to [[ResolvedTable]] or [[ResolvedView]] during analysis.
+ *
+ * If 'isResolutionRequired' is set to false and the name cannot be resolved to a table or view,
+ * [[UnresolvedTableOrView]] will be converted to [[NotFoundTableOrView]].
  */
-case class UnresolvedTableOrView(multipartIdentifier: Seq[String]) extends LeafNode {
+case class UnresolvedTableOrView(
+    multipartIdentifier: Seq[String],
+    isResolutionRequired: Boolean = true) extends LeafNode {
   override lazy val resolved: Boolean = false
   override def output: Seq[Attribute] = Nil
 }
 
+/**
+ * Holds the name of a table or view that has been looked up in a catalog, but not found.
+ * This is a "resolved" logical plan.
+ */
+case class NotFoundTableOrView(multipartIdentifier: Seq[String]) extends LeafNode {

Review comment:
      @cloud-fan `DROP TABLE` is a bit tricky to handle in the new resolution framework because the command has the `ifExists` option. So, even if a table/view cannot be resolved, we shouldn't fail if `ifExists` is set to true. Please let me know if the current approach is reasonable. Thanks in advance!
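
For illustration only (the class and method names here are assumptions, not the PR's actual code), the `ifExists` handling described above boils down to three outcomes at execution time:

```scala
// Toy model of DROP TABLE execution over the possible resolution outcomes.
sealed trait ResolvedChild
case class ResolvedRelation(name: Seq[String]) extends ResolvedChild
case class MissingRelation(name: Seq[String]) extends ResolvedChild

case class DropTableSketch(child: ResolvedChild, ifExists: Boolean) {
  def run(dropFromCatalog: Seq[String] => Unit): Unit = child match {
    case ResolvedRelation(name) =>
      dropFromCatalog(name)              // relation exists: drop it
    case MissingRelation(_) if ifExists =>
      ()                                 // IF EXISTS set: succeed silently
    case MissingRelation(name) =>        // no IF EXISTS: surface the error
      throw new NoSuchElementException(s"Table or view not found: ${name.mkString(".")}")
  }
}
```

So a name that resolves to neither a table nor a view only fails the query when `ifExists` is false, which is the case the `NotFoundTableOrView` node is meant to carry through analysis.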




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org