Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/02/10 06:31:53 UTC

[GitHub] [spark] gengliangwang commented on a change in pull request #24682: [SPARK-27838][SQL] Support user provided non-nullable avro schema for nullable catalyst schema without any null record

gengliangwang commented on a change in pull request #24682: [SPARK-27838][SQL] Support user provided non-nullable avro schema for nullable catalyst schema without any null record
URL: https://github.com/apache/spark/pull/24682#discussion_r376885875
 
 

 ##########
 File path: docs/sql-migration-guide-upgrade.md
 ##########
 @@ -132,6 +132,10 @@ license: |
 
   - Since Spark 3.0, Spark will cast `String` to `Date/TimeStamp` in binary comparisons with dates/timestamps. The previous behaviour of casting `Date/Timestamp` to `String` can be restored by setting `spark.sql.legacy.typeCoercion.datetimeToString` to `true`.
 
+  - Since Spark 3.0, when Avro files are written with a user-provided schema, the fields are matched by field name between the catalyst schema and the Avro schema instead of by position.
+
+  - Since Spark 3.0, when Avro files are written with a user-provided non-nullable schema, Spark can still write the files even if the catalyst schema is nullable. However, Spark will throw a runtime NPE if any of the records contains null.
 
 Review comment:
   @cloud-fan before this PR, an error would be thrown in `resolveNullableType` if a user specified a non-nullable Avro schema to write a nullable dataframe.
   After this PR, there will be a warning message and a possible runtime error instead.
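   A minimal sketch of the scenario being discussed, assuming Spark 3.0 with the external `spark-avro` module on the classpath; the column names, schema, and output path are illustrative only:
   
   ```scala
   import org.apache.spark.sql.SparkSession
   
   val spark = SparkSession.builder().appName("avro-nullability-example").getOrCreate()
   import spark.implicits._
   
   // The catalyst schema of this DataFrame is nullable ("name" is a nullable String column),
   // but none of the rows actually contain null.
   val df = Seq(("one", 1), ("two", 2)).toDF("name", "value")
   
   // User-provided Avro schema whose fields are non-nullable (no union with "null").
   val avroSchema =
     """{
       |  "type": "record",
       |  "name": "topLevelRecord",
       |  "fields": [
       |    {"name": "name", "type": "string"},
       |    {"name": "value", "type": "int"}
       |  ]
       |}""".stripMargin
   
   // With the behavior described in this PR, the write succeeds because no record is null;
   // if a record did contain null, a runtime NullPointerException would be thrown.
   df.write
     .format("avro")
     .option("avroSchema", avroSchema)
     .save("/tmp/avro-nonnullable-output")
   ```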

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org