Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/09/21 20:25:31 UTC

[GitHub] [spark] imback82 commented on a change in pull request #29811: [SPARK-32959][SQL][TEST] Fix an invalid test in DataSourceV2SQLSuite

imback82 commented on a change in pull request #29811:
URL: https://github.com/apache/spark/pull/29811#discussion_r491597496



##########
File path: sql/core/src/test/scala/org/apache/spark/sql/connector/TestV2SessionCatalogBase.scala
##########
@@ -47,10 +47,13 @@ private[connector] trait TestV2SessionCatalogBase[T <: Table] extends Delegating
       tables.get(ident)
     } else {
       // Table was created through the built-in catalog
-      val t = super.loadTable(ident)
-      val table = newTable(t.name(), t.schema(), t.partitioning(), t.properties())
-      tables.put(ident, table)
-      table
+      super.loadTable(ident) match {
+        case v1Table: V1Table if v1Table.v1Table.tableType == CatalogTableType.VIEW => v1Table
+        case t =>
+          val table = newTable(t.name(), t.schema(), t.partitioning(), t.properties())
+          tables.put(ident, table)
+          table
+      }

Review comment:
       Now that the test runs, it actually fails, because this block removes the `V1Table` info, which is needed in the following:
   https://github.com/apache/spark/blob/f1dc479d39a6f05df7155008d8ec26dff42bb06c/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala#L1008
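
    For context, here is a minimal, self-contained sketch of why re-wrapping every loaded table breaks the analyzer. The types below are stand-ins (toy classes, not Spark's actual `Table`/`V1Table` connector classes); they only illustrate that the analyzer rule linked above pattern-matches on `V1Table`, so a view that has been converted into another `Table` subclass no longer hits that case:

    // Toy stand-ins for Spark's connector/catalog classes, for illustration only.
    sealed trait Table { def name: String }
    case class V1Table(name: String, isView: Boolean) extends Table
    case class InMemoryTable(name: String) extends Table

    // Mimics the test catalog's old behavior: every table loaded through the
    // built-in catalog was re-wrapped, discarding the original V1Table instance.
    def wrapEverything(t: Table): Table = InMemoryTable(t.name)

    // Mimics the analyzer rule linked above, which only recognizes V1Table.
    def resolveAsView(t: Table): Option[String] = t match {
      case v1: V1Table if v1.isView => Some(s"resolved view ${v1.name}")
      case _                        => None // the V1Table info was lost
    }

    object Demo extends App {
      val view = V1Table("v", isView = true)
      println(resolveAsView(view))                 // Some(resolved view v)
      println(resolveAsView(wrapEverything(view))) // None: the wrap drops V1Table
    }

    This is why the patched `loadTable` above passes a `V1Table` whose `tableType` is `VIEW` through unchanged instead of re-wrapping it with `newTable(...)`.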




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org