Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/03/04 05:48:03 UTC

[GitHub] [spark] zsxwing commented on a change in pull request #31713: [SPARK-34599][SQL] Fix the issue that INSERT INTO OVERWRITE doesn't support partition columns containing dot for DSv2

zsxwing commented on a change in pull request #31713:
URL: https://github.com/apache/spark/pull/31713#discussion_r587156355



##########
File path: sql/core/src/test/scala/org/apache/spark/sql/connector/InsertIntoTests.scala
##########
@@ -477,5 +477,15 @@ trait InsertIntoSQLOnlyTests
         verifyTable(t1, spark.table(view))
       }
     }
+
+    test("SPARK-34599: InsertInto: overwrite - dot in the partition column name - static mode") {
+      import testImplicits._
+      val t1 = "tbl"
+      withTable(t1) {
+        sql(s"CREATE TABLE $t1 (`a.b` string, `c.d` string) USING $v2Format PARTITIONED BY (`a.b`)")
+        sql(s"INSERT OVERWRITE $t1 PARTITION (`a.b` = 'a') (`c.d`) VALUES('b')")

Review comment:
       > does this bug only exist when the INSERT has a column list?
   
    No. It exists in queries without a column list as well. I made this test use a column list to ensure we don't have a similar issue there.
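
    For reference, a minimal sketch of what the non-column-list variant could look like (hypothetical code, not part of this PR; it assumes the same helpers used in InsertIntoTests.scala, such as withTable, verifyTable, v2Format, and testImplicits):

        test("SPARK-34599: InsertInto: overwrite - dot in the partition column name - no column list") {
          import testImplicits._
          val t1 = "tbl"
          withTable(t1) {
            // Partition column `a.b` contains a dot; the INSERT has no explicit column list.
            sql(s"CREATE TABLE $t1 (`a.b` string, `c.d` string) USING $v2Format PARTITIONED BY (`a.b`)")
            sql(s"INSERT OVERWRITE $t1 PARTITION (`a.b` = 'a') VALUES('b')")
            // Expect a single row ("a", "b") after the overwrite.
            verifyTable(t1, Seq(("a", "b")).toDF("a.b", "c.d"))
          }
        }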



