Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/02/02 07:36:33 UTC

[GitHub] [spark] cloud-fan commented on a change in pull request #31421: [SPARK-33591][SQL][FOLLOWUP] Add legacy config for recognizing null partition spec values

cloud-fan commented on a change in pull request #31421:
URL: https://github.com/apache/spark/pull/31421#discussion_r568379292



##########
File path: docs/sql-migration-guide.md
##########
@@ -41,6 +41,8 @@ license: |
 
   - In Spark 3.2, the auto-generated `Cast` (such as those added by type coercion rules) will be stripped when generating column alias names. E.g., `sql("SELECT floor(1)").columns` will be `FLOOR(1)` instead of `FLOOR(CAST(1 AS DOUBLE))`.
 
+  - In Spark 3.2, a null partition spec value is parsed as-is. In Spark 3.1 or earlier, it is parsed as a string literal of its text representation, e.g., the string "null". To restore the legacy behavior, you can set `spark.sql.legacy.parseNullPartitionSpecAsStringLiteral` to true.

Review comment:
       Let's be more specific
   ```
   In Spark 3.2, `PARTITION(col=null)` is always parsed as a null literal in the partition spec. In Spark 3.1 or earlier,
   it is parsed as a string literal ..., if the partition column is string type. To restore the legacy behavior, ...
   ```
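   For readers following along, here is a minimal sketch of the difference described in the note and the suggested wording, assuming a local Spark 3.2 session and a hypothetical table `t` with a string-typed partition column `p`; the config name comes from this PR's title, and the exact legacy semantics are as described above rather than verified here.
   ```
   // Hypothetical, self-contained example (not part of the PR or the reviewer's comment).
   import org.apache.spark.sql.SparkSession

   object NullPartitionSpecExample {
     def main(args: Array[String]): Unit = {
       val spark = SparkSession.builder()
         .master("local[*]")
         .appName("null-partition-spec")
         .getOrCreate()

       // A string-typed partition column, the case the review comment calls out.
       spark.sql("CREATE TABLE t (id INT, p STRING) USING parquet PARTITIONED BY (p)")

       // Spark 3.2: PARTITION(p = null) is parsed as a null literal, so this row
       // lands in the null partition.
       spark.sql("INSERT INTO t PARTITION (p = null) VALUES (1)")

       // With the legacy config added by this PR, the Spark 3.1 behavior should be
       // restored: PARTITION(p = null) is treated as the string literal 'null'
       // for a string partition column.
       spark.sql("SET spark.sql.legacy.parseNullPartitionSpecAsStringLiteral=true")
       spark.sql("INSERT INTO t PARTITION (p = null) VALUES (2)")

       // Expect one row with p = NULL and one with p = 'null'.
       spark.sql("SELECT id, p FROM t ORDER BY id").show()

       spark.stop()
     }
   }
   ```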




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org