Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/08/18 04:28:24 UTC

[GitHub] [spark] cloud-fan commented on a diff in pull request #37488: [WIP][SPARK-40055][SQL] listCatalogs should also return spark_catalog even when spark_catalog implementation is defaultSessionCatalog

cloud-fan commented on code in PR #37488:
URL: https://github.com/apache/spark/pull/37488#discussion_r948638236


##########
sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSuite.scala:
##########
@@ -723,8 +723,12 @@ class DataSourceV2SQLSuite
     df.createOrReplaceTempView("source")
 
     sql(s"CREATE TABLE table_name USING parquet AS SELECT id, data FROM source")
-
-    checkAnswer(sql(s"TABLE default.table_name"), spark.table("source"))
+    val x = spark.table("source")
+    val xr = x.collect()
+    val y = sql(s"TABLE default.table_name")
+    val xy = y.collect()

Review Comment:
   I think this is a test problem: `spark.conf.unset(V2_SESSION_CATALOG_IMPLEMENTATION.key)` alone no longer works, because the session catalog instance is now cached in the catalog map. We should call `CatalogManager.reset` after `spark.conf.unset(V2_SESSION_CATALOG_IMPLEMENTATION.key)`.
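
   A rough sketch of the cleanup ordering the comment suggests (illustrative only: the `try`/`finally` framing, the `InMemoryTableSessionCatalog` class name, and the `spark.sessionState.catalogManager` accessor are assumptions about the test-suite context, not the actual patch):

   ```scala
   import org.apache.spark.sql.internal.SQLConf.V2_SESSION_CATALOG_IMPLEMENTATION

   try {
     // Point spark_catalog at a custom session catalog implementation for the test.
     spark.conf.set(
       V2_SESSION_CATALOG_IMPLEMENTATION.key,
       classOf[InMemoryTableSessionCatalog].getName)
     // ... test body ...
   } finally {
     // Unsetting the conf alone is no longer enough: the catalog instance
     // built from the old conf value stays cached in CatalogManager's map.
     spark.conf.unset(V2_SESSION_CATALOG_IMPLEMENTATION.key)
     // Drop the cached catalog instances so the default session catalog
     // is re-created on the next lookup.
     spark.sessionState.catalogManager.reset()
   }
   ```

   The key point is the ordering: `reset` must run after the conf is unset, otherwise the next catalog lookup re-caches an instance built from the stale conf value.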



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

