Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/08/09 18:08:31 UTC

[GitHub] [spark] xuanyuanking commented on a change in pull request #25390: [SPARK-28662] [SQL] Create Hive Partitioned Table without specifying data type for partition columns will success in Spark 3.0

URL: https://github.com/apache/spark/pull/25390#discussion_r312592510
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala
 ##########
 @@ -985,7 +985,15 @@ class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder(conf) {
         } else {
           CreateTable(tableDescWithPartitionColNames, mode, Some(q))
         }
-      case None => CreateTable(tableDesc, mode, None)
+      case None =>
 
 Review comment:
   Thanks for your investigation, Hao!
   I think throwing an exception here when the partition column type is missing is the right behavior.
   The current behavior looks like a regression introduced by #23376, which silently drops a partition column declared without a type:
   ```
   spark-sql> CREATE TABLE tbl(a int) PARTITIONED BY (b) STORED AS parquet;
   Time taken: 1.856 seconds
   spark-sql> desc tbl;
   a	int	NULL
   Time taken: 0.46 seconds, Fetched 1 row(s)
   ```
   Could you also test the behavior in Hive 3.0? 
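   For illustration, the fail-fast behavior suggested above could be sketched as follows. This is a minimal standalone sketch, not the actual Spark patch; `PartitionColumn` and `validatePartitionColumns` are hypothetical names introduced here, not Spark APIs:
   ```scala
   // Hypothetical sketch (assumed names, not Spark internals): model a
   // parse-time check that every Hive-style partition column carries a
   // data type, instead of silently dropping untyped columns.
   case class PartitionColumn(name: String, dataType: Option[String])

   def validatePartitionColumns(cols: Seq[PartitionColumn]): Unit = {
     val untyped = cols.filter(_.dataType.isEmpty).map(_.name)
     if (untyped.nonEmpty) {
       // Fail fast with a clear message rather than dropping the column.
       throw new IllegalArgumentException(
         s"Partition column(s) ${untyped.mkString(", ")} must specify a data type")
     }
   }
   ```
   With such a check, `CREATE TABLE tbl(a int) PARTITIONED BY (b) STORED AS parquet` would fail at parse time instead of producing a table where `b` has vanished.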

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services
