Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/02/20 07:50:25 UTC

[GitHub] [spark] MaxGekk commented on a change in pull request #30421: [SPARK-33474][SQL] Support TypeConstructed partition spec value

MaxGekk commented on a change in pull request #30421:
URL: https://github.com/apache/spark/pull/30421#discussion_r579617798



##########
File path: docs/sql-migration-guide.md
##########
@@ -101,6 +101,8 @@ license: |
 
   - Since Spark 3.1, CHAR/CHARACTER and VARCHAR types are supported in the table schema. Table scan/insertion will respect the char/varchar semantic. If char/varchar is used in places other than table schema, an exception will be thrown (CAST is an exception that simply treats char/varchar as string like before). To restore the behavior before Spark 3.1, which treats them as STRING types and ignores a length parameter, e.g. `CHAR(4)`, you can set `spark.sql.legacy.charVarcharAsString` to `true`.
 
+  - In Spark 3.1, we support using a typed literal matching the partition column's type as the partition value in SQL. For example, for a partitioned table whose partition column is of date type, the typed date literal `date '2020-01-01'` in the spec `PARTITION (dt = date '2020-01-01')` is treated as the partition value `2020-01-01`. In Spark 3.0, the partition value is treated as the string `date '2020-01-01'`, which is not a valid date string and is therefore converted to `__HIVE_DEFAULT_PARTITION__`.

Review comment:
       Should be `In Spark 3.2`?
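
       For context, a minimal sketch of the behavior the added paragraph describes (the table name, column names, and `USING parquet` clause are hypothetical, not taken from the PR):

           -- hypothetical table; assumes a DATE-typed partition column
           CREATE TABLE events (id INT, dt DATE) USING parquet PARTITIONED BY (dt);

           -- with typed-literal support, the spec value is parsed as the date 2020-01-01
           ALTER TABLE events ADD PARTITION (dt = date '2020-01-01');

           -- without it, the value is kept as the raw string "date '2020-01-01'",
           -- which is not a valid date and ends up as __HIVE_DEFAULT_PARTITION__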




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org