Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/10/29 01:47:15 UTC

[GitHub] [spark] imback82 commented on a change in pull request #34429: [WIP][SPARK-37150][SQL] Migrate DESCRIBE NAMESPACE to use V2 command by default

imback82 commented on a change in pull request #34429:
URL: https://github.com/apache/spark/pull/34429#discussion_r738880189



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/DescribeNamespaceExec.scala
##########
@@ -45,9 +45,13 @@ case class DescribeNamespaceExec(
 
     if (isExtended) {
       val properties = metadata.asScala -- CatalogV2Util.NAMESPACE_RESERVED_PROPERTIES
-      if (properties.nonEmpty) {
-        rows += toCatalystRow("Properties", properties.toSeq.mkString("(", ",", ")"))
-      }
+      val propertiesStr =
+        if (properties.isEmpty) {
+          ""
+        } else {
+          properties.toSeq.mkString("(", ", ", ")")
+        }
+      rows += toCatalystRow("Properties", propertiesStr)

Review comment:
       @cloud-fan Here are the differences between the v1 and v2 commands:
   1. In extended mode, the v1 command prints the "Properties" row with an empty string, whereas the v2 command doesn't print the row at all if there are no properties.
   2. The v1 command puts a space after the `,` between properties: e.g., `((a,b), (c,d))` in v1 vs. `((a,b),(c,d))` in v2.
   3. The v1 command uses `Database Name` as the row name, whereas v2 uses `Namespace Name`.
   
   `1)` and `2)` are easy to resolve (e.g., just follow the v1 behavior; see the sketch below), but I wasn't sure of the best way to address `3)`. Do we need to unify this name?
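   
   To make `1)` and `2)` concrete, here is a minimal, hypothetical sketch of the formatting difference (not the actual v1/v2 code paths, just an illustration with a made-up `Map` of properties):
   
   ```scala
   // Hypothetical properties map, only for illustration.
   val properties = Map("a" -> "b", "c" -> "d")
   
   // v1-style formatting: a space after the comma between entries.
   val v1Str = properties.toSeq.mkString("(", ", ", ")")  // "((a,b), (c,d))"
   
   // Pre-PR v2-style formatting: no space after the comma.
   val v2Str = properties.toSeq.mkString("(", ",", ")")   // "((a,b),(c,d))"
   
   // Difference 1): v1 always emits the "Properties" row, even when the map is
   // empty; the change in this PR makes v2 do the same by emitting "" in that case.
   val propertiesStr = if (properties.isEmpty) "" else v1Str
   ```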




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


