Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/08/05 02:48:55 UTC

[GitHub] [spark] cloud-fan opened a new pull request, #37415: [DO NOT MERGE]

cloud-fan opened a new pull request, #37415:
URL: https://github.com/apache/spark/pull/37415

   <!--
   Thanks for sending a pull request!  Here are some tips for you:
     1. If this is your first time, please read our contributor guidelines: https://spark.apache.org/contributing.html
     2. Ensure you have added or run the appropriate tests for your PR: https://spark.apache.org/developer-tools.html
     3. If the PR is unfinished, add '[WIP]' in your PR title, e.g., '[WIP][SPARK-XXXX] Your PR title ...'.
     4. Be sure to keep the PR description updated to reflect all changes.
     5. Please write your PR title to summarize what this PR proposes.
     6. If possible, provide a concise example to reproduce the issue for a faster review.
     7. If you want to add a new configuration, please read the guideline first for naming configurations in
        'core/src/main/scala/org/apache/spark/internal/config/ConfigEntry.scala'.
     8. If you want to add or modify an error type or message, please read the guideline first in
        'core/src/main/resources/error/README.md'.
   -->
   
   ### What changes were proposed in this pull request?
   <!--
   Please clarify what changes you are proposing. The purpose of this section is to outline the changes and how this PR fixes the issue. 
   If possible, please consider writing useful notes for better and faster reviews in your PR. See the examples below.
     1. If you refactor some codes with changing classes, showing the class hierarchy will help reviewers.
     2. If you fix some SQL features, you can provide some references of other DBMSes.
     3. If there is design documentation, please add the link.
     4. If there is a discussion in the mailing list, please add the link.
   -->
   
   
   ### Why are the changes needed?
   <!--
   Please clarify why the changes are needed. For instance,
     1. If you propose a new API, clarify the use case for a new API.
     2. If you fix a bug, you can clarify why it is a bug.
   -->
   
   
   ### Does this PR introduce _any_ user-facing change?
   <!--
   Note that it means *any* user-facing change including all aspects such as the documentation fix.
   If yes, please clarify the previous behavior and the change this PR proposes - provide the console output, description and/or an example to show the behavior difference if possible.
   If possible, please also clarify if this is a user-facing change compared to the released Spark versions or within the unreleased branches such as master.
   If no, write 'No'.
   -->
   
   
   ### How was this patch tested?
   <!--
   If tests were added, say they were added here. Please make sure to add some test cases that check the changes thoroughly including negative and positive cases if possible.
   If it was tested in a way different from regular unit tests, please clarify how you tested step by step, ideally copy and paste-able, so that other reviewers can test and check, and descendants can verify in the future.
   If tests were not added, please describe why they were not added and/or why it was difficult to add.
   If benchmark tests were added, please run the benchmarks in GitHub Actions for a consistent environment; the instructions are at: https://spark.apache.org/developer-tools.html#github-workflow-benchmarks.
   -->
   




[GitHub] [spark] cloud-fan commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r941444294


##########
sql/hive/src/test/scala/org/apache/spark/sql/hive/MetastoreDataSourcesSuite.scala:
##########
@@ -1251,7 +1251,7 @@ class MetastoreDataSourcesSuite extends QueryTest with SQLTestUtils with TestHiv
       e = intercept[AnalysisException] {
         table(tableName).write.mode(SaveMode.ErrorIfExists).saveAsTable(tableName)
       }.getMessage
-      assert(e.contains(s"Table `$tableName` already exists"))
+      assert(e.contains(s"Table `$SESSION_CATALOG_NAME`.`default`.`$tableName` already exists"))

Review Comment:
   https://github.com/apache/spark/pull/37021 missed this case





[GitHub] [spark] cloud-fan commented on pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on PR #37415:
URL: https://github.com/apache/spark/pull/37415#issuecomment-1210179993

   cc @viirya @MaxGekk 




[GitHub] [spark] cloud-fan commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r941444715


##########
sql/hive/src/test/scala/org/apache/spark/sql/sources/HadoopFsRelationTest.scala:
##########
@@ -384,7 +385,7 @@ abstract class HadoopFsRelationTest extends QueryTest with SQLTestUtils with Tes
       val msg = intercept[AnalysisException] {
         testDF.write.format(dataSourceName).mode(SaveMode.ErrorIfExists).saveAsTable("t")
       }.getMessage
-      assert(msg.contains("Table `t` already exists"))
+      assert(msg.contains(s"Table `$SESSION_CATALOG_NAME`.`default`.`t` already exists"))

Review Comment:
   https://github.com/apache/spark/pull/37021 missed this case





[GitHub] [spark] cloud-fan commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r942000717


##########
sql/core/src/test/resources/sql-tests/results/explain-aqe.sql.out:
##########
@@ -909,7 +909,7 @@ Execute CreateViewCommand (1)
 Output: []
 
 (2) CreateViewCommand
-Arguments: `spark_catalog`.`default`.`explain_view`, SELECT key, val FROM explain_temp1, false, false, PersistedView, true

Review Comment:
   We shouldn't attach the v1 catalog name here, as `CreateViewCommand` can create temp views.





[GitHub] [spark] cloud-fan commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r941435240


##########
sql/core/src/test/resources/sql-tests/results/describe.sql.out:
##########
@@ -552,7 +552,7 @@ struct<plan:string>
 -- !query output
 == Physical Plan ==
 Execute DescribeTableCommand
-   +- DescribeTableCommand `spark_catalog`.`default`.`t`, false, [col_name#x, data_type#x, comment#x]
+   +- DescribeTableCommand `default`.`t`, false, [col_name#x, data_type#x, comment#x]

Review Comment:
   This reverts to the behavior before https://github.com/apache/spark/pull/37021 . We only want to attach the v1 catalog name in some command output and error messages. Qualifying the identifiers in every v1 command is a different story: it's non-trivial, as we would need to follow the v2 command framework and look up each identifier to see whether it refers to a temp view or a persistent table.





[GitHub] [spark] cloud-fan commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r942553748


##########
sql/core/src/test/resources/sql-tests/results/explain-aqe.sql.out:
##########
@@ -909,7 +909,7 @@ Execute CreateViewCommand (1)
 Output: []
 
 (2) CreateViewCommand
-Arguments: `spark_catalog`.`default`.`explain_view`, SELECT key, val FROM explain_temp1, false, false, PersistedView, true

Review Comment:
   good point.





[GitHub] [spark] cloud-fan commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r942000918


##########
sql/core/src/test/resources/sql-tests/results/show-create-table.sql.out:
##########
@@ -294,7 +294,7 @@ SHOW CREATE TABLE view_SPARK_30302 AS SERDE
 -- !query schema
 struct<createtab_stmt:string>
 -- !query output
-CREATE VIEW default.view_SPARK_30302 (
+CREATE VIEW default.view_spark_30302 (

Review Comment:
   This is the actual table name stored in the catalog.





[GitHub] [spark] cloud-fan commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r943101024


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala:
##########
@@ -149,17 +148,48 @@ class SessionCatalog(
   }
 
   /**
-   * Format table name, taking into account case sensitivity.
+   * Formats object names, taking into account case sensitivity.
    */
-  protected[this] def formatTableName(name: String): String = {
+  protected def format(name: String): String = {

Review Comment:
   It was `protected` before, so I kept it.
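   For context, a minimal self-contained sketch of what such a case-sensitivity-aware helper typically does (illustration only, not the actual `SessionCatalog` code; the `caseSensitiveAnalysis` flag below is a stand-in for the session's case-sensitivity setting):

   ```scala
   import java.util.Locale

   // Sketch: object names (tables, views, functions, databases) are normalized
   // to lower case unless case-sensitive analysis is enabled.
   class NameFormatter(caseSensitiveAnalysis: Boolean) {
     def format(name: String): String =
       if (caseSensitiveAnalysis) name else name.toLowerCase(Locale.ROOT)
   }

   // With the default case-insensitive analysis, mixed-case names are lower-cased.
   assert(new NameFormatter(caseSensitiveAnalysis = false).format("MyTable") == "mytable")
   ```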





[GitHub] [spark] MaxGekk commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
MaxGekk commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r943084330


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala:
##########
@@ -149,17 +148,48 @@ class SessionCatalog(
   }
 
   /**
-   * Format table name, taking into account case sensitivity.
+   * Formats object names, taking into account case sensitivity.
    */
-  protected[this] def formatTableName(name: String): String = {
+  protected def format(name: String): String = {

Review Comment:
   nit: It could even be `private`, since it is only used in `SessionCatalog`.





[GitHub] [spark] cloud-fan commented on pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on PR #37415:
URL: https://github.com/apache/spark/pull/37415#issuecomment-1210081625

   cc @ulysses-you 




[GitHub] [spark] cloud-fan commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r941439129


##########
sql/core/src/test/resources/sql-tests/results/udaf.sql.out:
##########
@@ -31,7 +31,7 @@ SELECT default.myDoubleAvg(int_col1, 3) as my_avg from t1
 struct<>
 -- !query output
 org.apache.spark.sql.AnalysisException
-Invalid number of arguments for function spark_catalog.default.myDoubleAvg. Expected: 1; Found: 2; line 1 pos 7
+Invalid number of arguments for function spark_catalog.default.mydoubleavg. Expected: 1; Found: 2; line 1 pos 7

Review Comment:
   This was a test-only problem before: we forgot to lower-case function names when creating functions, and the in-memory catalog used in tests is case-preserving. This is inconsistent with table creation and should be fixed. Note that the production behavior (with Hive Metastore) is unchanged: Hive Metastore is not case-preserving.
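   To illustrate why only the test output changes, here is a toy case-preserving catalog (illustration only, not Spark's actual in-memory catalog): it stores whatever name it is given, so skipping normalization leaks the user's original casing into error messages, while normalizing first matches what a non-case-preserving Hive Metastore reports anyway.

   ```scala
   import java.util.Locale
   import scala.collection.mutable

   // Toy case-preserving catalog: stores names verbatim.
   val registered = mutable.LinkedHashSet.empty[String]
   def createFunction(rawName: String, normalizeFirst: Boolean): Unit =
     registered += (if (normalizeFirst) rawName.toLowerCase(Locale.ROOT) else rawName)

   createFunction("myDoubleAvg", normalizeFirst = false)  // old test behavior
   createFunction("myDoubleAvg", normalizeFirst = true)   // behavior after this PR
   assert(registered == Set("myDoubleAvg", "mydoubleavg"))
   ```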






[GitHub] [spark] cloud-fan commented on pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on PR #37415:
URL: https://github.com/apache/spark/pull/37415#issuecomment-1211560698

   thanks for the review, merging to master!




[GitHub] [spark] cloud-fan commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r941443825


##########
sql/hive/src/test/scala/org/apache/spark/sql/hive/MetastoreDataSourcesSuite.scala:
##########
@@ -336,7 +336,7 @@ class MetastoreDataSourcesSuite extends QueryTest with SQLTestUtils with TestHiv
         }.getMessage
 
         assert(
-          message.contains(s"Table $SESSION_CATALOG_NAME.default.ctasJsonTable already exists."),
+          message.contains(s"Table $SESSION_CATALOG_NAME.default.ctasjsontable already exists."),

Review Comment:
   It's better to use the actual table name, which is always lower-cased because of Hive Metastore.





[GitHub] [spark] viirya commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
viirya commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r943092133


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/identifiers.scala:
##########
@@ -113,6 +88,7 @@ object AliasIdentifier {
  */
 case class TableIdentifier(table: String, database: Option[String], catalog: Option[String])
   extends CatalystIdentifier {
+  assert(catalog.isEmpty || database.isDefined)

Review Comment:
   This requires that `database` be defined if `catalog` is defined?





[GitHub] [spark] cloud-fan commented on a diff in pull request #37415: [DO NOT MERGE]

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r941392655


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala:
##########
@@ -345,9 +372,9 @@ class SessionCatalog(
       throw QueryCompilationErrors.createExternalTableWithoutLocationError
     }
 
-    val db = formatDatabaseName(tableDefinition.identifier.database.getOrElse(getCurrentDatabase))
-    val table = formatTableName(tableDefinition.identifier.table)
-    val tableIdentifier = attachSessionCatalog(TableIdentifier(table, Some(db)))
+    val qualifiedIdent = qualifyIdentifier(tableDefinition.identifier)
+    val db = qualifiedIdent.database.get
+    val table = qualifiedIdent.table

Review Comment:
   Most of the changes in this file follow the same pattern: previously, we got the current database and formatted the names to build a `TableIdentifier`; now, we qualify the input identifier and read the table/database names off the qualified identifier.
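   To make the before/after concrete, a rough, self-contained sketch of the pattern (a simplified `TableIdentifier` and hard-coded stand-ins for `getCurrentDatabase` and the session catalog name; this is an illustration, not the actual `SessionCatalog` code):

   ```scala
   import java.util.Locale

   // Simplified identifier for illustration (not Spark's TableIdentifier).
   case class TableIdentifier(table: String, database: Option[String], catalog: Option[String])

   val sessionCatalogName = "spark_catalog"  // stand-in for SESSION_CATALOG_NAME
   val currentDatabase    = "default"        // stand-in for getCurrentDatabase
   def format(name: String): String = name.toLowerCase(Locale.ROOT)

   // The centralized step: normalize names, fill in the current database,
   // and attach the v1 session catalog name.
   def qualifyIdentifier(ident: TableIdentifier): TableIdentifier =
     TableIdentifier(
       format(ident.table),
       Some(format(ident.database.getOrElse(currentDatabase))),
       Some(sessionCatalogName))

   // Call sites then just read the fields off the qualified identifier.
   val qualified = qualifyIdentifier(TableIdentifier("MyTable", None, None))
   assert(qualified == TableIdentifier("mytable", Some("default"), Some("spark_catalog")))
   ```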





[GitHub] [spark] cloud-fan closed pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan closed pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog
URL: https://github.com/apache/spark/pull/37415




[GitHub] [spark] cloud-fan commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r941442157


##########
sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala:
##########
@@ -783,7 +784,7 @@ abstract class DDLSuite extends QueryTest with DDLSuiteBase {
         sql("ALTER TABLE tab1 RENAME TO default.tab2")
       }
       assert(e.getMessage.contains(
-        s"RENAME TEMPORARY VIEW from '`tab1`' to '`$SESSION_CATALOG_NAME`.`default`.`tab2`': " +
+        s"RENAME TEMPORARY VIEW from '`tab1`' to '`default`.`tab2`': " +

Review Comment:
   This reverts to the behavior before https://github.com/apache/spark/pull/37021 : we shouldn't blindly attach catalog names in every v1 command. In this case, we shouldn't mention `SESSION_CATALOG_NAME` because temp views do not belong to any catalog.





[GitHub] [spark] cloud-fan commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r941442898


##########
sql/core/src/test/scala/org/apache/spark/sql/execution/command/v1/ShowFunctionsSuite.scala:
##########
@@ -37,6 +39,10 @@ trait ShowFunctionsSuiteBase extends command.ShowFunctionsSuiteBase
   override protected def dropFunction(name: String): Unit = {
     sql(s"DROP FUNCTION IF EXISTS $name")
   }
+  override protected def qualifiedFunName(ns: String, name: String): String = {
+    // `SessionCatalog` lower-cases function names before creating.

Review Comment:
   ditto https://github.com/apache/spark/pull/37415/files#r941439129





[GitHub] [spark] ulysses-you commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
ulysses-you commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r942169914


##########
sql/core/src/test/resources/sql-tests/results/explain-aqe.sql.out:
##########
@@ -909,7 +909,7 @@ Execute CreateViewCommand (1)
 Output: []
 
 (2) CreateViewCommand
-Arguments: `spark_catalog`.`default`.`explain_view`, SELECT key, val FROM explain_temp1, false, false, PersistedView, true

Review Comment:
   It seems we can distinguish whether the target view of `CreateViewCommand` is a temp view?
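   For illustration, a rough sketch of that kind of branching, using simplified stand-ins (the `PersistedView` marker matches the EXPLAIN output above; the temp-view cases and helper names here are assumptions, not the actual `CreateViewCommand`):

   ```scala
   // Simplified stand-ins for illustration only.
   sealed trait ViewType
   case object LocalTempView extends ViewType
   case object GlobalTempView extends ViewType
   case object PersistedView extends ViewType

   case class ViewIdent(name: String, database: Option[String], catalog: Option[String])

   // Only persisted views live in a catalog, so only they would get the
   // v1 catalog name attached when rendered in EXPLAIN output.
   def displayIdent(ident: ViewIdent, viewType: ViewType): ViewIdent = viewType match {
     case PersistedView =>
       ident.copy(
         database = ident.database.orElse(Some("default")),
         catalog = Some("spark_catalog"))
     case _ => ident  // temp views: leave the identifier unqualified
   }
   ```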





[GitHub] [spark] cloud-fan commented on a diff in pull request #37415: [SPARK-40020][SQL] Centralize the code of qualifying identifiers in SessionCatalog

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on code in PR #37415:
URL: https://github.com/apache/spark/pull/37415#discussion_r943101147


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/identifiers.scala:
##########
@@ -113,6 +88,7 @@ object AliasIdentifier {
  */
 case class TableIdentifier(table: String, database: Option[String], catalog: Option[String])
   extends CatalystIdentifier {
+  assert(catalog.isEmpty || database.isDefined)

Review Comment:
   Yea, it's super weird if an identifier has a catalog field but no database.
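   For illustration, a self-contained sketch of the invariant using a simplified copy of the case class (not the Spark one), showing which combinations the assertion allows:

   ```scala
   // Simplified copy for illustration: a catalog may only be set when a database is.
   case class TableIdentifier(table: String, database: Option[String], catalog: Option[String]) {
     assert(catalog.isEmpty || database.isDefined,
       "an identifier with a catalog must also have a database")
   }

   TableIdentifier("t", None, None)                         // ok: bare name
   TableIdentifier("t", Some("db"), None)                   // ok: database-qualified
   TableIdentifier("t", Some("db"), Some("spark_catalog"))  // ok: fully qualified
   // TableIdentifier("t", None, Some("spark_catalog"))     // would fail the assert
   ```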


